How can I exceed the 60% memory limit of IIS7 in an ASP.NET caching application?


Pardon if this is more serverfault vs. stackoverflow; it seems to be on the border. We have an application that caches a large amount of product data for an e-commerce application using ASP.NET caching. This is a dictionary object with 65K elements, and our calculations put the object's size at ~10GB.

Problem: The amount of memory the object consumes seems to be far in excess of our 10GB calculation.

BIGGEST CONCERN: We can't seem to use over 60% of the 32GB in the server.

What we've tried so far (SO strips the tags, pardon the formatting):

In machine.config, system.web: processModel autoConfig="true" memoryLimit="80"

In web.config, system.web/caching/cache: privateBytesLimit="20000000000" (and 0, the default, of course) and percentagePhysicalMemoryUsedLimit="90"

Environment: Windows 2008 R2 x64, 32GB RAM, IIS7

Nothing seems to allow us to exceed the 60% value.
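For clarity, here is roughly what those settings look like as XML. This is a reconstruction of the configuration described above, assuming the standard element names, with the values quoted in the question:

```xml
<!-- machine.config, system.web section -->
<system.web>
  <processModel autoConfig="true" memoryLimit="80" />
</system.web>

<!-- web.config, system.web/caching/cache section -->
<system.web>
  <caching>
    <cache privateBytesLimit="20000000000"
           percentagePhysicalMemoryUsedLimit="90" />
  </caching>
</system.web>
```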

See screenshot of taskman.

Tags: c# asp.net iis7 caching. Asked Jun 11 '10 at 3:13 by evilknot.

An educated guess: The server is re-tuning its memory to adjust for the increased workload you are throwing at it, using more of the swap file to compensate, or garbage collecting memory faster. Something like that. What does the Performance tab in Taskman look like as you ramp up the load? Does the size of the swap file increase? – Robert Harvey Jun 11 '10 at 3:31

@Robert: Swap stays pretty much flat (which makes sense, since it's an in-memory cache). Worth checking though. @all: I'm wondering if the sheer size of a single object is the problem. Does the GC require a certain amount of "slack space" for shifting objects around, and did this one object exceed that? – evilknot Jun 11 '10 at 3:52

Are you swapping objects in and out of the dictionary? If you are, that could be putting pressure on the GC, as each swap frees up an object that must be collected at some point. The GC may not wait for you to run out of memory before it performs a collection. Some memory profiling might be in order. – Robert Harvey Jun 11 '10 at 4:28

Is it always the same point of memory usage at which it fails? Anything in the event log around the time of dying? Another random guess is memory fragmentation. If you have a debugger attached before it dies, does anything get thrown when it dies? (OOM, for instance) – James Manning Jun 11 '10 at 6:15

@Robert Harvey: We're just populating the cache one time, when the application starts. – evilknot Jun 11 '10 at 17:01

The built-in caching is not all that feature-rich and you will struggle to get it to do much more (unless some IIS guru has a clever workaround). We spent a lot of time working on this and gave up. We actually store slimmer objects in the cache and fetch the fuller objects as needed.

When we needed to go beyond this we investigated Memcached and Velocity, but have held off on deploying them for now. They are more feature-rich, though.

Also, how are you storing the items in the cache in code? Are you putting them in there at application start, or after the first request for each? The reason I ask is to check whether your cache keys are effective: you may actually be populating the cache over and over and never retrieving anything (this may be the case for just one object type). We managed to do this once by appending the time to a date-specific cache key, for example.
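To illustrate that failure mode, here is a minimal sketch (hypothetical names, using HttpRuntime.Cache) of how a time-stamped key silently defeats the cache, versus a stable key:

```csharp
using System;
using System.Collections.Generic;
using System.Web;   // HttpRuntime.Cache

public class Product { public int Id; public string Name; }

public static class ProductCache
{
    public static Dictionary<int, Product> GetProducts()
    {
        // BAD: the key changes every second, so lookups never match an earlier
        // insert and the cache is silently repopulated on every call.
        // string key = "products_" + DateTime.Now.ToString("yyyyMMdd_HHmmss");

        // GOOD: a stable key, so the dictionary is loaded once and then reused.
        string key = "products";

        var products = HttpRuntime.Cache[key] as Dictionary<int, Product>;
        if (products == null)
        {
            products = LoadProductsFromDatabase();
            HttpRuntime.Cache.Insert(key, products);
        }
        return products;
    }

    // Hypothetical loader standing in for the real data-access code.
    private static Dictionary<int, Product> LoadProductsFromDatabase()
    {
        return new Dictionary<int, Product>();
    }
}
```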

We're definitely exploring other caches. We're just populating the cache one time, when the application starts, and it doesn't change. We have been looking at the above, in addition to NCache. Thanks! – evilknot Jun 11 '10 at 18:50

+1, asp.net cache is really wrong here -- you probably want something else and out of process. – Wyatt Barnett Feb 14 at 3:50

A little late, but I'm having nearly the same issue. The problem with the memoryLimit setting on processModel is that it just seems to have no effect, despite being documented as such. percentagePhysicalMemoryUsedLimit similarly looks like it should do something, but has no effect.

privateBytesLimit="20000000000" does work, though. I debugged the process and found the CacheMemorySizePressure object; it had successfully picked up the value and set it as _memoryLimit. I would double-check that setting.

Another option is setting a Private Memory Usage recycle threshold on the IIS app pool. That should also get picked up and override the default 60% limit. A third option is using the new MemoryCache class and setting the PhysicalMemoryLimit on it.
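For the MemoryCache route, here is a minimal sketch assuming .NET 4's System.Runtime.Caching and its standard configuration keys; the cache name and the limit values are illustrative only:

```csharp
using System.Collections.Specialized;
using System.Runtime.Caching;

class CacheSetup
{
    static MemoryCache CreateCache()
    {
        // Limits supplied in code rather than via web.config; the keys are the
        // standard MemoryCache configuration entries.
        var config = new NameValueCollection
        {
            { "cacheMemoryLimitMegabytes", "20000" },   // cap for this cache instance
            { "physicalMemoryLimitPercentage", "90" },  // % of total RAM before trimming
            { "pollingInterval", "00:02:00" }           // how often the limits are checked
        };

        return new MemoryCache("ProductCache", config);
    }

    static void Main()
    {
        var cache = CreateCache();
        cache.Set("products", new object() /* your 10GB dictionary */,
                  new CacheItemPolicy { Priority = CacheItemPriority.NotRemovable });
    }
}
```

Note that MemoryCache reads its limits from this collection (or from the system.runtime.caching section of web.config), not from the system.web/caching/cache element discussed above.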

If using an out-of-proc cache server is an option, we have had good results with ScaleOut StateServer. I have not thrown as much data at it as you are talking about, but it is pretty full-featured and has a .NET API.

