An Argument for Memory Profiling for Your .NET Applications

Why and How to Memory Profile Your .NET Applications

Every application is judged by its memory footprint, be it a mobile application or a thick client running on your desktop. Believe the statistics or not, people simply love applications that are lightweight.

The world is changing rapidly, and programmers are trying to make the apps that run on devices perform faster and eat less memory, on both embedded and non-embedded systems. The number of applications running under terminal services or on shared web servers is growing quickly, and vigilance in making efficient use of memory is the key to success. I say vigilance, because most of the crucial memory leaks go unnoticed in the early stages due to the lack of profiling.

On the .NET frontier, it is harder to achieve a relatively small memory footprint than in a native application. The main culprit is the large size of the native image generated from the IL. Creating images through Ngen does not help in certain cases, because the code is not shared between instances of a single application either. Memory usage has a direct relation to the speed of the application: the lighter the usage, the faster it performs. You don't want people to restart your application the way they restart Windows, do you?

The CPU takes more time to fetch instructions from memory, and in such cases you find tasks like switching between applications painfully slow. The garbage collector undoubtedly manages memory for managed code; it reclaims unused memory to ensure good performance. However, the Common Language Runtime (CLR) is loaded for each and every .NET process and therefore takes a big toll on the memory cost.

Each process also carries a certain amount of plug-and-play code that may include irrelevant pieces, and that affects memory usage in its own way. In addition, the garbage collector will not release objects that still have references to them. Hence, we should assist the garbage collector by clearing references to unused objects and so reduce the memory consumption.
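
To make that concrete, here is a minimal sketch (ReportCache and Pages are hypothetical names of my own, not from the article or the framework) of how a lingering reference keeps memory alive, and how clearing it lets the garbage collector reclaim it:

    using System;
    using System.Collections.Generic;

    // Hypothetical cache: as long as this static list holds references,
    // the garbage collector can never reclaim the buffers it contains.
    static class ReportCache
    {
        public static readonly List<byte[]> Pages = new List<byte[]>();
    }

    class Program
    {
        static void Main()
        {
            // Simulate work that parks large buffers in the static cache.
            for (int i = 0; i < 100; i++)
                ReportCache.Pages.Add(new byte[1024 * 1024]); // ~1 MB each

            Console.WriteLine("Before clearing: {0:N0} bytes", GC.GetTotalMemory(true));

            // Assist the collector: drop the references we no longer need.
            ReportCache.Pages.Clear();

            Console.WriteLine("After clearing:  {0:N0} bytes", GC.GetTotalMemory(true));
        }
    }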

The .NET Framework has a strong design that helps you build consistently well-performing applications. Unlike C++, you don't have to worry to a great extent, but you should still be extremely watchful for memory bottlenecks.

The following reasons (I have listed a few that strike my mind) often contribute to high memory usage:

  • Using a large number of objects
  • Holding objects longer than required
  • Using reference types exhaustively
  • Using memory-hungry code, such as preparing XML with a DataTable instead of an XmlTextWriter (see the sketch after this list)
  • Choosing to do resource-intensive operations in the application rather than in other places, such as the database
  • Throwing exceptions, and
  • Seldom using finally blocks
  • Running ASP.NET applications with debug="true"
Remember the iterator concept in C#? for, foreach, and yield return all show you how memory ought to be used: they move through your input collection and let you keep only the current element in memory at any point in time.
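
For instance, a small sketch (SquaresUpTo is just an illustrative name) of how yield return hands out one element at a time while foreach consumes it:

    using System;
    using System.Collections.Generic;

    class Program
    {
        // With yield return, only the current value exists at any point;
        // nothing forces the whole sequence into a list up front.
        static IEnumerable<int> SquaresUpTo(int count)
        {
            for (int i = 1; i <= count; i++)
                yield return i * i;   // produced lazily, one value per iteration
        }

        static void Main()
        {
            // foreach pulls one value at a time from the iterator.
            foreach (int square in SquaresUpTo(5))
                Console.WriteLine(square);
        }
    }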

.NET applications do not watch their memory usage the way unmanaged applications do, and upgrading the hardware to prop up a poorly written application is no longer a realistic option.

Subramanian Ramaswamy and Vance Morrison put it clearly in their article on memory usage auditing for .NET applications: "The first step in reducing the memory consumption of your application is to understand how much of it is currently used."

Your first stop for finding out how much memory is consumed would be the Task Manager. But wait a minute: there is something you should know before gathering information from Task Manager's memory statistics. Tim Anderson's post titled "How much memory does my .NET application use?" gives a clear picture of how to use the Task Manager to gather this information.
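
If you want similar numbers from inside your own process, a quick sketch using the standard GC.GetTotalMemory and System.Diagnostics.Process APIs prints the managed heap size next to the working set and private bytes that Task Manager reports:

    using System;
    using System.Diagnostics;

    class Program
    {
        static void Main()
        {
            Process current = Process.GetCurrentProcess();

            // Managed heap only (forces a collection first for a stable number).
            Console.WriteLine("Managed heap:  {0:N0} bytes", GC.GetTotalMemory(true));

            // Roughly what Task Manager shows for the whole process.
            Console.WriteLine("Working set:   {0:N0} bytes", current.WorkingSet64);
            Console.WriteLine("Private bytes: {0:N0} bytes", current.PrivateMemorySize64);
        }
    }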

Using the tools below, you can quickly spot the weak points of your application and also drill down to the faulty code.

CLR Profiler V4 - The CLR profiler from Microsoft.

ANTS Profiler - The code profiler from Red Gate.

dotTrace Profiler - The profiler from JetBrains.

There is one that I have yet to evaluate: EQATEC Profiler, which looks like the simplest of them from its description. It instruments your application in a separate step so that the application collects its own profiling information, and the EQATEC profiler then simply displays the timing data at the end. This gives you greater flexibility to run your applications independently of the profiler: the instrumented applications can run on other servers and be profiled separately. That can be done in parallel with the development cycle and saves precious time.

Also note that there are some pretty good tools, such as ADPlus, for taking memory dumps on your servers to capture critical data. You would then need a memory dump analyzer such as WinDbg to analyze them.

Visual Studio offers its own rich profiling suite; you need the Visual Studio Team Suite edition to profile your applications. It is much better than the performance tools that were available in the 2005 version. You can find the Performance Wizard on the Analyze menu. I would not be giving you any insights that aren't already in the MSDN article titled "Find Application Bottlenecks with Visual Studio Profiler".

The bitter truth is that memory problems are not like the "Object reference not set to an instance of an object" error that shows up during certain workflows in your application. If you do not profile, they surface only when you run into them, and by then they can crash your application, hang it, or make the entire system unstable.

On a lighter note, memory issues can make you bald. So make sure you get memory usage under control early in development.

Remember, optimization is a fine art.

References:

  • Tim Anderson, "How much memory does my .NET application use?"
  • "Find Application Bottlenecks with Visual Studio Profiler" (MSDN)



About the Author

Srinath M S

I would love to leave a footprint in this flat world
