Introduction to Caching in ASP.NET

Caching is a useful technique for improving application performance. It involves storing data in memory for quick access, which avoids expensive database queries and thus speeds up applications. This .NET programming tutorial discusses caching with reference to ASP.NET.

What is Caching?

Caching is a method that stores the page output or application data between HTTP requests in memory so that future requests for the same piece of data or page are obtained from memory. It enhances application speed by allowing for quicker page rendering and fewer server resources. .NET developers can use caching to create scalable, high-performance apps.

A cache is a location, usually in memory, that facilitates high-speed data access. A cache stores a subset of your data so that it can be read quickly, avoiding repeated trips to slower data stores for the same piece of data.

Read: Best C# IDEs and Code Editors for .NET Developers

What are the Benefits of Caching?

There are many benefits to using caching and reasons why programmers use caching in their software development. A few are:

Application Performance

Caching can improve the responsiveness and throughput of an application by reducing latency, improving scalability, and reducing resource consumption.

Storing commonly used items in cache can be helpful in improving user experience or reducing the load on resource-intensive operations such as accessing a database. You can use caching for storing session state, user profiles, and other pieces of data that need frequent access. Using caching for storing expensive calculations also improves overall application performance by avoiding repeated calculations and enhances user experience.

Reduce Calls to Backend Services and Databases

Backend services may be located remotely or may have limited capacity. Reducing these calls improves application performance and eases the load on back-end systems, while also improving resiliency: services that are not overwhelmed with requests fail less often.

Ease the Load on Shared Resources

External resources such as databases are shared with other consumers (i.e., other applications), so no single application should degrade their performance or cause them to fail by sending large amounts of work in short periods of time. Because cached data is stored locally in memory instead of being fetched externally again and again, there is less network traffic and less load on shared resources like databases.

Read: Detecting and Preventing Memory Leaks in .NET

What is Response Caching?

Response caching is a technique for caching web server responses by leveraging cache-related headers. The server includes headers such as Cache-Control in its HTTP responses to instruct clients and intermediate proxies how to cache replies for all, or a subset of, requests.

Unlike output caching, response caching does not have to keep responses in the web server's memory: it can hand the caching work off to clients and proxies, which reduces the number and latency of client requests reaching the web server. That said, ASP.NET Core also provides a Response Caching Middleware that caches eligible responses on the server. It stores data in memory by default, and alternative storage providers can be enabled.

In short, the server sends the response to the client with a set of headers that describe the desired caching behavior. The client caches the response and reuses it when a subsequent request is made for the same resource.
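As an illustration, ASP.NET Core lets you control these cache headers per action with the [ResponseCache] attribute. The sketch below is a minimal, hypothetical controller; the route and returned data are illustrative:

```csharp
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("[controller]")]
public class ProductsController : ControllerBase
{
    // Emits "Cache-Control: public,max-age=60" on the response, so clients
    // and proxies may reuse it for 60 seconds without re-contacting the server.
    [HttpGet]
    [ResponseCache(Duration = 60, Location = ResponseCacheLocation.Any)]
    public IActionResult Get() => Ok(new[] { "Widget", "Gadget" });
}
```

ResponseCacheLocation.Any allows both private (browser) and shared (proxy) caches; use ResponseCacheLocation.Client to restrict caching to the browser only.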

If you are using ASP.NET Core's minimal hosting model, you should add response caching services to your project using the following code in the Program.cs file:

builder.Services.AddResponseCaching();

Next, you should enable response caching using the following code:

app.UseResponseCaching();

In projects that use a Startup class, you can instead enable response caching by configuring the middleware there:

public void ConfigureServices(IServiceCollection services)
{
    services.AddResponseCaching();
    services.AddMvc();
}

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    // Other code
    app.UseResponseCaching();
}

When configuring response caching, you can specify the maximum size of the response cache and the largest cacheable size of the response body (in bytes). You can also specify whether responses are cached on case-sensitive paths. The following code snippet illustrates how you can configure response caching with these parameters:

builder.Services.AddResponseCaching(options =>
{
    options.UseCaseSensitivePaths = true;
    options.MaximumBodySize = 1024; // bytes
});

What is In-Memory Caching?

You can also take advantage of in-memory caching to improve the performance and scalability of an application. In ASP.NET 6, in-memory caching is exposed through the IMemoryCache interface, which looks like this:

public interface IMemoryCache : IDisposable
{
    bool TryGetValue(object key, out object value);
    ICacheEntry CreateEntry(object key);
    void Remove(object key);
}

You can add in-memory cache services to the request processing pipeline using the following code snippet in the Program.cs file:

builder.Services.AddMemoryCache();

Once you have registered in-memory caching services with the built-in IoC container, you can use in-memory caching in your controllers as shown in the code snippet given below:

[ApiController]
[Route("[controller]")]
public class MyController : ControllerBase
{
    private readonly IMemoryCache cache;

    public MyController(IMemoryCache cache)
    {
        this.cache = cache;
    }
}
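With the cache injected, an action can read and populate entries. The GetOrCreate extension method from Microsoft.Extensions.Caching.Memory is a convenient way to do both in one step; in this sketch, the cache key format and the LoadItemFromDatabase call are hypothetical:

```csharp
[HttpGet("{id}")]
public IActionResult GetById(int id)
{
    // GetOrCreate runs the factory delegate only on a cache miss
    // and stores the result under the given key.
    var item = cache.GetOrCreate($"item:{id}", entry =>
    {
        entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
        return LoadItemFromDatabase(id); // hypothetical data-access call
    });

    return Ok(item);
}
```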

Read: Implementing In-Memory Caching in .NET

What is Distributed Caching?

Distributed caching is a common way to cache data in a large, cloud-based application. It is useful when you need fast access to your data, but the volume of that data prevents it from being stored on one server.

This type of caching stores replicas of your data across multiple servers. The replicas are held in memory on each server (rather than being persisted to disk), which means they can be lost if a server crashes or restarts.

A distributed cache increases performance because it reduces the number of requests sent to the underlying data store. Examples of distributed caches that work with ASP.NET Core include Redis, NCache, and SQL Server, all accessible through the IDistributedCache abstraction. (Microsoft's earlier distributed caching product, AppFabric, has been discontinued.)
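As a sketch of how this looks in ASP.NET Core, the snippet below registers a Redis-backed IDistributedCache and reads and writes a string entry. It assumes the Microsoft.Extensions.Caching.StackExchangeRedis package and a Redis instance at localhost:6379; the key, value, and instance name are illustrative:

```csharp
// In Program.cs: register a Redis-backed distributed cache.
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379"; // assumed Redis endpoint
    options.InstanceName = "SampleApp:";      // key prefix, illustrative
});

// In a controller or service that receives IDistributedCache via DI:
await cache.SetStringAsync("greeting", "Hello from Redis",
    new DistributedCacheEntryOptions
    {
        AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10)
    });
string? greeting = await cache.GetStringAsync("greeting");
```

Because every application instance talks to the same Redis store, a value cached by one server is immediately visible to all the others.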

What is the Cache-Aside Pattern?

You can load data from data storage into a cache on demand. This technique is known as the “cache-aside pattern.” It can enhance performance and ensure data consistency between the cache and the underlying data storage. This pattern states that you must first check your cache before retrieving an item from the data store. If the item is already in the cache, you use it. If it is not, you fetch the item from the data store, add a copy to the cache, and then use it.
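A minimal cache-aside sketch using IMemoryCache might look like this. The Product type, the key format, and the injected repository are hypothetical stand-ins for your own data-access code:

```csharp
public async Task<Product?> GetProductAsync(int id)
{
    string key = $"product:{id}";

    // 1. Check the cache first.
    if (cache.TryGetValue(key, out Product? product))
        return product;

    // 2. On a miss, load the item from the data store...
    product = await repository.FindAsync(id); // hypothetical data access

    // 3. ...and add a copy to the cache for subsequent requests.
    if (product is not null)
        cache.Set(key, product, TimeSpan.FromMinutes(5));

    return product;
}
```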

The cache-aside pattern, when implemented with an in-memory cache, doesn’t scale as well as response caching or distributed caching, because it can’t share its cached data across multiple machines in a server farm or web garden: all of its cached data sits on one machine rather than on the many machines, in different locations, where your app instances are running. The most scalable form of this pattern is called database caching, wherein you store your cached data inside the database itself, alongside all other data, so that any application instance can retrieve items from a single place regardless of which machine is executing its requests at any given point in time.

Cache Invalidation

Developers need to handle stale data and write code to invalidate cache entries. For example, if you are caching the full list of movie genres in memory and someone inserts a new genre into the database through another application instance, your cached copy of the list is no longer up to date. Eventually, your app will request the updated list from the database, but until then it serves outdated information.
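One common approach, sketched here with IMemoryCache, is to evict the affected entry whenever the underlying data changes. The "genres:all" key and the repository call are hypothetical:

```csharp
public async Task AddGenreAsync(Genre genre)
{
    await repository.InsertAsync(genre); // hypothetical data access

    // Evict the now-stale list so the next read
    // repopulates it from the database.
    cache.Remove("genres:all");
}
```

Note that this only invalidates the cache on the instance that performed the write; with an in-memory cache, other instances keep their stale copies until expiration, which is one more reason to pair invalidation with a distributed cache in multi-server deployments.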

Cache Expiration

You need to implement cache timeouts or expirations for each piece of data you cache. If a user updates a record after it has already been cached (for example, if they change their shipping address), then eventually your app will read the updated information from the database, but until then it serves outdated information. This is essentially an extension of the stale-data problem.
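With IMemoryCache, expirations are configured per entry through MemoryCacheEntryOptions. The key, the profile value, and the durations below are illustrative:

```csharp
var options = new MemoryCacheEntryOptions
{
    // Evicted 30 minutes after being cached, regardless of use.
    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(30),

    // Also evicted if the entry goes unread for 5 minutes.
    SlidingExpiration = TimeSpan.FromMinutes(5)
};

cache.Set("user:42:profile", profile, options);
```

Combining the two is a common design choice: the sliding window keeps hot data in the cache, while the absolute cap guarantees an upper bound on how stale any entry can get.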

Read: Productivity Tools for .NET Developers

Final Thoughts on Caching in ASP.NET and .NET

There are different types of caching patterns such as the Cache-Aside Pattern, the Write-Through Pattern, the Write-Behind Pattern, and so forth. In this ASP.NET programming tutorial, we have examined the Cache-Aside Pattern. We will discuss the other types of caching patterns and best practices of caching in a future article on CodeGuru.

Read more ASP.NET development and programming tutorials.

Joydip Kanjilal
A Microsoft Most Valuable Professional in ASP.NET, Speaker, and Author of several books and articles. More than 25 years of experience in IT with more than 18 years in Microsoft .NET and its related technologies. He was selected as a Community Credit Winner at http://www.community-credit.com several times. He has authored 8 books and more than 500 articles in some of the most reputed sites worldwide including MSDN, Info World, CodeMag, Tech Beacon, Tech Target, Developer, CodeGuru, and more.
