Mohamed Ben Slimen
7 November 2022

Regain Control of Cache Management With Output Caching and .NET 7


In an increasingly fast-paced world, a system’s or application’s ability to provide quick and consistent responses is critical in determining its success and performance.

Developers have various resources to help them in their “quest for the top,” such as best practices for writing code and techniques for improving data access requests. However, there are situations where the limitations of code optimization become apparent.

To help you understand this, let’s look at e-commerce sites. In sale periods, the product catalog display time becomes longer. The high volume of requests is usually to blame, but the fact that the system repeats the same routine for each request can also contribute. This is where a common technique, known as caching, comes into play.

Caching is the process of storing temporary data (or a part of it) on fast, highly available storage media so that requests for the same data can be processed faster in the future.

Different caching technologies can be used at different levels, such as caching the most-used data, caching the results of the most demanding calculations, or caching the output of HTTP responses.

In this post, we’ll show you a new ASP.NET Core resource you can use to optimize your output caches: the output caching middleware.

In earlier versions of ASP.NET Core, responses from endpoints were cached by the response caching middleware, which strictly followed the HTTP caching headers. The client tells the middleware which caching strategy to apply through the “Cache-Control” header and its directives, such as “no-cache.”

This client-driven model is precisely why a new, server-controlled approach, called output caching, was needed.


Configuring Output Caching for an ASP.NET Core Application


Installing and configuring output caching is pretty straightforward. You need to make sure you have the newest version of the .NET 7.0 software development kit (SDK). When this post was written, we were using version 7.0.0-rc2. If you don’t have it yet, you can download it by clicking this link: https://dotnet.microsoft.com/en-us/download/dotnet/7.0


Here is an example of a minimal API basic configuration with output caching:
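The original code sample is missing; a minimal sketch of this setup (the endpoint itself is illustrative) might look like:

```csharp
var builder = WebApplication.CreateBuilder(args);

// Register the output caching services in the IoC container.
builder.Services.AddOutputCache();

var app = builder.Build();

// Add the output caching middleware to the pipeline.
app.UseOutputCache();

// Opt this endpoint in to output caching.
app.MapGet("/", () => "Hello World!").CacheOutput();

app.Run();
```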





  • AddOutputCache(): registers the services needed for cache management in the inversion of control (IoC) container.
  • UseOutputCache(): adds cache management to the ASP.NET Core middleware pipeline.

If necessary, you can replace the default values with values of your choice.
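A sketch of overriding these defaults through the options delegate (the specific values are illustrative):

```csharp
builder.Services.AddOutputCache(options =>
{
    // Cap the total cache storage at 100 MB.
    options.SizeLimit = 100 * 1024 * 1024;

    // Keep entries for 30 seconds instead of the default one minute.
    options.DefaultExpirationTimeSpan = TimeSpan.FromSeconds(30);
});
```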



  • SizeLimit: this is the maximum size allowed by the cache storage. If this limit is reached, IOutputCacheStore will not allow any new entries.
  • DefaultExpirationTimeSpan: the default lifetime of a cache entry, set to one minute. After that, the entry is automatically evicted.


Enabling Caching for Endpoints


Enable caching for all routes with a basic policy:
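A sketch of such a policy:

```csharp
builder.Services.AddOutputCache(options =>
{
    // Cache every endpoint by default.
    options.AddBasePolicy(policy => policy.Cache());
});
```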


Disable caching for all routes with a basic policy:
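And the opposite, again as a sketch:

```csharp
builder.Services.AddOutputCache(options =>
{
    // Opt every endpoint out of caching by default.
    options.AddBasePolicy(policy => policy.NoCache());
});
```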


The CacheOutput method is used to configure the caching mechanism for a specific route:
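For example (the route and data are illustrative):

```csharp
// Cache this endpoint's responses for 30 seconds.
app.MapGet("/events", () => new[] { "TechDays", "DevFest" })
   .CacheOutput(policy => policy.Expire(TimeSpan.FromSeconds(30)));
```

With controllers, the equivalent is the [OutputCache] attribute on an action or controller.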


HTTP Request and Cache Variation


Caching With SetVaryByQuery


By default, the full path of the URL and the set of query parameters are used to determine if a query is unique.

If a query parameter value changes, or if a new query parameter is added, the cache is missed and a new entry is created.

To demonstrate, suppose we have the following route that returns a list of “events” and we want to filter them by city:
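The original sample is missing; a sketch of such a route, with hypothetical in-memory data, could be:

```csharp
// Hypothetical event list, for illustration only.
var events = new[]
{
    new { Name = "TechDays", City = "paris" },
    new { Name = "DevFest",  City = "lyon" }
};

// Return all events, or only those matching the "city" query parameter.
app.MapGet("/events", (string? city) =>
        city is null ? events : events.Where(e => e.City == city).ToArray())
   .CacheOutput();
```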


We will have a different entity cache for each of the queries below:

–   GET https://localhost:5269/events

–   GET https://localhost:5269/events?city=paris

–   GET https://localhost:5269/events?city=lyon


To take the concept even further, we’re adding a “partnerToken” variable that will let us reward partners who show our events on their platform.

As the code shows, this value doesn’t affect the return result:
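A sketch of that route ("GetEvents" stands in for a hypothetical data-access helper):

```csharp
// "partnerToken" is accepted but never used to build the response:
// it is consumed by other middleware (rate limiting, for instance).
app.MapGet("/events", (string? city, string? partnerToken) => GetEvents(city))
   .CacheOutput();
```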


Without any prior configuration on our part, each unique URL (path plus query string) produces a distinct cache key, which may not be what we want: the “partnerToken” variable has no impact on the query result, but it will be used by other upstream middleware, such as the rate limiter.

To fix this and tell our cache how to handle certain parameters, we can configure the route with “CacheOutput()” and use the “OutputCachePolicyBuilder.SetVaryByQuery” method, as shown in the following example:
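A sketch of the fix ("GetEvents" is a hypothetical helper): only "city" participates in the cache key, so two URLs differing only by "partnerToken" hit the same entry.

```csharp
app.MapGet("/events", (string? city, string? partnerToken) => GetEvents(city))
   .CacheOutput(policy => policy.SetVaryByQuery("city"));
```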


Caching With SetVaryByHeader


Like the query parameters, HTTP headers can also be used to vary our cache.
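For example, to keep one cache entry per requested language (route and helper are illustrative):

```csharp
app.MapGet("/events", () => GetEvents())
   .CacheOutput(policy => policy.SetVaryByHeader("Accept-Language"));
```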


There are other variation types and mechanisms, such as “SetVaryByHost” and “SetVaryByRouteValue.”


Cache Policies


We’ve seen how to configure caches at the level of our application’s different routes. This can introduce duplication into our code and make future maintenance and updates more complex.

You can group your cache settings into a policy that will be referenced or associated with one or more routes to avoid duplicating your cache settings across multiple routes.

There are two policy types:

  • Basic policies
  • Custom or named policies


Basic Policies


This is the default policy for all declared routes that do not have a specific cache configuration via the CacheOutput method or attribute.

These policies are not named. They are registered with the “AddBasePolicy” method.

The example below shows how to disable caching for all routes:
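A sketch of that base policy:

```csharp
builder.Services.AddOutputCache(options =>
{
    // No route is cached unless it explicitly opts in.
    options.AddBasePolicy(policy => policy.NoCache());
});
```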


Custom Policies


Unlike basic policies, these policies are customized and require a name. They are not applied to all routes but can be linked to a route using the “CacheOutput” method/attribute.

The following example shows how to enable caching for a specific route:
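A sketch of a named policy and its use (the policy name "Events30s" and the "GetEvents" helper are our own, illustrative choices):

```csharp
builder.Services.AddOutputCache(options =>
{
    // Declare a named policy: 30-second lifetime, keyed on "city" only.
    options.AddPolicy("Events30s", policy =>
        policy.Expire(TimeSpan.FromSeconds(30)).SetVaryByQuery("city"));
});

// Attach the named policy to a route.
app.MapGet("/events", (string? city) => GetEvents(city))
   .CacheOutput("Events30s");
```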


Tags & Cache Purge


As stated at the beginning of this article, cache entries expire after one minute by default, although you can adjust this period to fit your scenario.

On the other hand, sometimes we need to invalidate or purge items from our cache, especially if the resource has been updated.

To do this, we need to:

  • “Tag” these resources. Tags are a way of grouping and labeling a set of items in our cache.
  • Use the “EvictByTagAsync” method to remove all cached items associated with that tag.

The example below shows how to purge the cache after adding or modifying an event:
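A sketch of both steps ("Event" and "SaveEventAsync" are hypothetical application types, not part of the framework):

```csharp
// Hypothetical domain type.
record Event(int Id, string Name, string City);

// 1. Tag the cached list of events.
app.MapGet("/events", () => GetEvents())
   .CacheOutput(policy => policy.Tag("events"));

// 2. Purge every entry tagged "events" after a write.
app.MapPost("/events", async (Event newEvent, IOutputCacheStore cache,
                              CancellationToken ct) =>
{
    await SaveEventAsync(newEvent); // hypothetical persistence helper
    await cache.EvictByTagAsync("events", ct);
    return Results.Created($"/events/{newEvent.Id}", newEvent);
});
```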


Scaling and External Cache Store


By default, output caching uses an in-memory type local cache. Here is a piece of code from the default configuration:
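The excerpt is missing; the following is a paraphrased sketch of what the default registration does, not the exact framework source (MemoryOutputCacheStore is internal to ASP.NET Core):

```csharp
// The default store is an in-memory implementation backed by MemoryCache,
// sized according to OutputCacheOptions.SizeLimit.
services.TryAddSingleton<IOutputCacheStore>(sp =>
{
    var options = sp.GetRequiredService<IOptions<OutputCacheOptions>>().Value;
    return new MemoryOutputCacheStore(new MemoryCache(
        new MemoryCacheOptions { SizeLimit = options.SizeLimit }));
});
```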



You can find the full code here: OutputCacheServiceCollectionExtensions

However, there are some drawbacks to this quick and easy solution:

  • A local cache is not shared between instances, which makes horizontal scaling problematic.
  • Memory consumption will affect the application’s performance.
  • If we plan on deploying the application in the cloud, this could result in high memory costs.

Setting up our cache as an external service is one way to solve this.

In the example below, we’ll configure and use Azure Cache for Redis as an external cache store.

Please note that we have omitted some parts of the code for readability.


1 – Add a package reference to your solution
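Assuming the store is built directly on the StackExchange.Redis client, the reference might be added like this:

```shell
dotnet add package StackExchange.Redis
```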



2 – Implement “IOutputCacheStore” in a class named “RedisOutputCacheStore.”
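Output caching resolves its storage through the IOutputCacheStore interface. A sketch of a Redis-backed implementation (the tag-indexing scheme via Redis sets is our own simplification):

```csharp
using StackExchange.Redis;

public sealed class RedisOutputCacheStore : IOutputCacheStore
{
    private readonly IConnectionMultiplexer _redis;

    public RedisOutputCacheStore(IConnectionMultiplexer redis) => _redis = redis;

    public async ValueTask<byte[]?> GetAsync(string key, CancellationToken cancellationToken)
    {
        var value = await _redis.GetDatabase().StringGetAsync(key);
        return value.HasValue ? (byte[]?)value : null;
    }

    public async ValueTask SetAsync(string key, byte[] value, string[]? tags,
                                    TimeSpan validFor, CancellationToken cancellationToken)
    {
        var db = _redis.GetDatabase();

        // Index the key under each tag so EvictByTagAsync can find it later.
        foreach (var tag in tags ?? Array.Empty<string>())
            await db.SetAddAsync($"tag:{tag}", key);

        await db.StringSetAsync(key, value, validFor);
    }

    public async ValueTask EvictByTagAsync(string tag, CancellationToken cancellationToken)
    {
        var db = _redis.GetDatabase();

        // Delete every key recorded under this tag.
        foreach (var member in await db.SetMembersAsync($"tag:{tag}"))
            await db.KeyDeleteAsync((string)member!);
    }
}
```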


3 – Register your cache store using an “AddRedisOutputCache()” ServiceCollection extension method.
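A sketch of that extension method; the name matches the article’s step, but the body is our own assumption:

```csharp
public static class RedisOutputCacheExtensions
{
    public static IServiceCollection AddRedisOutputCache(
        this IServiceCollection services, string connectionString)
    {
        services.AddOutputCache();
        services.AddSingleton<IConnectionMultiplexer>(
            _ => ConnectionMultiplexer.Connect(connectionString));

        // Registered after AddOutputCache so this store, not the default
        // in-memory one, is resolved for IOutputCacheStore.
        services.AddSingleton<IOutputCacheStore, RedisOutputCacheStore>();
        return services;
    }
}
```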


Replace “builder.Services.AddOutputCache()” in your “Program.cs” file with:
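For example (the connection string is a placeholder for your Azure Cache for Redis instance):

```csharp
builder.Services.AddRedisOutputCache("<your-redis-connection-string>");
```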


More About Output Caching


As you have read, output caching can be used in a variety of ways to boost application performance.

⚠️ It’s important to remember that adding a cache is meant to boost performance, not fix issues caused by poor design or implementation choices.

You can see more examples here: GitHub examples

