A colleague suggested that one of our web applications may have a "memory leak". An application is said to have a memory leak when it uses memory but doesn't release it when it is done. Over time, the server the application runs on will run out of memory.
With .NET, most of the time you don't need to worry about releasing memory (most people don't understand how and when to do it anyway). You create an object, which takes up memory. The object then goes out of scope, and at some indeterminate time in the future the "garbage collector" (part of the .NET runtime) will clean up and release that memory for you.
The garbage collector will only clean up .NET objects. These are called "managed resources". Sometimes our .NET classes contain references to objects that are not .NET objects; these are called "unmanaged resources". A .NET class that holds a reference to an unmanaged resource is supposed to implement the IDisposable interface, which exposes a Dispose method that frees the unmanaged resource. If you instantiate an object of a class that implements IDisposable, you should call its Dispose method or declare the object with the Using keyword. The Using keyword is preferred because it calls Dispose even if an exception occurs while using the object. If you fail to do this, you risk the unmanaged resource sticking around in memory indefinitely. If you create a class that contains an instance of a class that implements IDisposable, then your class should also implement IDisposable and call the member's Dispose method from your own Dispose method. If you follow these rules, you simply shouldn't have memory leaks.
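To make this concrete, here is a minimal C# sketch of both rules. The class name `LogWriter` and file name `app.log` are made up for illustration; the wrapped `StreamWriter` is a real .NET class that implements IDisposable. (In C#, the equivalent of the Using keyword is the `using` statement.)

```csharp
using System;
using System.IO;

// A class that holds an IDisposable member (StreamWriter), so it
// implements IDisposable itself and disposes the member in turn.
public class LogWriter : IDisposable
{
    private readonly StreamWriter _writer;   // IDisposable member

    public LogWriter(string path)
    {
        _writer = new StreamWriter(path);
    }

    public void Write(string message)
    {
        _writer.WriteLine(message);
    }

    // Our Dispose calls the member's Dispose, per the rule above.
    public void Dispose()
    {
        _writer.Dispose();
    }
}

public static class Example
{
    public static void Main()
    {
        // The using statement guarantees Dispose is called when the
        // block exits, even if an exception is thrown inside it.
        using (var log = new LogWriter("app.log"))
        {
            log.Write("hello");
        }
    }
}
```

Without the `using` statement (or an explicit `Dispose` call in a finally block), the underlying file handle would stay open until the garbage collector got around to finalizing the writer, which may be much later or never.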
How do we determine whether there is a memory leak? The only way is to monitor the application's memory usage, and with web applications this presents challenges. A web application can use what appears to be a lot of memory without that indicating a problem, especially when it is under heavy load: the application will take what it needs as more users make requests. Web applications run inside a worker process for the application pool configured for the web site in IIS, and that process won't give up memory unless it feels "pressure" to do so from some other process. Releasing memory takes processing power, and IIS simply won't do it until it feels it needs to. So there will be fluctuations in the amount of memory an application uses that are not related at all to a problem or a "memory leak".

Another thing to keep in mind is that, by default, IIS recycles each app pool approximately once a day. Recycling the app pool essentially wipes out all the memory and starts fresh. If the app pool is configured to recycle more often than that, you may want to adjust it so you can get a good feel for memory usage over a longer period of time.

So, to determine whether there is a memory leak, you cannot rely on short-term observations of memory usage. You must monitor usage over a long period (a day or two). If memory usage gradually increases throughout that period, you probably have a problem. If the increase correlates with an increase in load, it may just be normal.
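One simple way to collect those long-term observations is a small console program, scheduled to run every few minutes on the web server, that logs the memory of each IIS worker process (w3wp.exe). This is just a sketch under that assumption; in practice you might use Performance Monitor counters instead.

```csharp
using System;
using System.Diagnostics;

public static class MemoryMonitor
{
    public static void Main()
    {
        // Each application pool runs in its own w3wp.exe worker process.
        // Logging a timestamped sample per process lets you chart usage
        // over a day or two and look for a steady climb.
        foreach (var proc in Process.GetProcessesByName("w3wp"))
        {
            Console.WriteLine(
                "{0}\tPID {1}\t{2:N0} bytes private",
                DateTime.Now, proc.Id, proc.PrivateMemorySize64);
        }
    }
}
```

`PrivateMemorySize64` (memory not shared with other processes) is usually a better leak indicator than working set, which fluctuates with OS paging decisions.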