Caching in ASP.NET Core (In-Memory vs Distributed vs Redis)

Introduction

Performance problems in ASP.NET Core APIs often have one root cause: repeated database or external service calls.

When your API fetches the same data repeatedly—configuration, product lists, user permissions, reference tables—it wastes CPU cycles, increases database load, and slows down response time.

Caching solves this problem.

Caching stores frequently accessed data in fast-access storage so your application can reuse it instead of recalculating or refetching it. When implemented correctly, caching dramatically improves:

  • API response time
  • Scalability
  • Database performance
  • Infrastructure cost efficiency

In this guide, you will learn how caching works in ASP.NET Core, the difference between In-Memory Cache, Distributed Cache, and Redis, and how to choose the right approach for production systems.

This is a must-know topic for senior .NET developers and a frequent interview discussion area.


What Is Caching?

Caching stores frequently used data temporarily so future requests can access it quickly.

Instead of this flow:

Client → API → Database → API → Client

You get:

Client → API → Cache → Client

This reduces database calls and speeds up response times significantly.


Why Caching Is Critical in ASP.NET Core

Reduce Database Load

Databases are often the biggest performance bottleneck. Caching reduces repeated queries.

Improve API Response Time

Memory access is much faster than database access.

Handle Traffic Spikes

Caching helps APIs survive sudden load increases without crashing.

Reduce Infrastructure Costs

Less database traffic means fewer resources required.


Types of Caching in ASP.NET Core

ASP.NET Core supports three major caching approaches:

  1. In-Memory Caching
  2. Distributed Caching
  3. Redis Caching

Let’s break them down.


In-Memory Caching in ASP.NET Core

What Is In-Memory Cache?

In-memory caching stores data inside the application’s memory.

It is:

  • Fast
  • Simple
  • Lightweight
  • Suitable for single-server applications

How to Enable In-Memory Cache

Step 1: Register Service

builder.Services.AddMemoryCache();

Step 2: Inject IMemoryCache

public class ProductService
{
    private readonly IMemoryCache _cache;

    public ProductService(IMemoryCache cache)
    {
        _cache = cache;
    }

    public async Task<List<Product>> GetProductsAsync()
    {
        if (!_cache.TryGetValue("products", out List<Product> products))
        {
            products = await LoadProductsFromDatabase();

            _cache.Set("products", products, TimeSpan.FromMinutes(10));
        }

        return products;
    }
}
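Note that the TryGetValue/Set pattern above has a small race: two concurrent requests can both miss the cache and both query the database. IMemoryCache also provides GetOrCreateAsync, which expresses the same read-through logic more compactly (though it is not strictly single-flight either). A sketch reusing the ProductService names from above:

```csharp
public async Task<List<Product>> GetProductsAsync()
{
    // On a cache miss, the factory runs and its result is stored under the key.
    return await _cache.GetOrCreateAsync("products", async entry =>
    {
        // Same 10-minute absolute expiration as the Set() call above
        entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10);
        return await LoadProductsFromDatabase();
    });
}
```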

When to Use In-Memory Cache

Use it when:

  • You have a single-instance application
  • Data does not need to be shared across servers
  • You cache lightweight data
  • You want simple implementation

Limitations of In-Memory Cache

  • Not shared across instances
  • Lost when application restarts
  • Not ideal for scaled environments

If you scale horizontally, each instance will have its own cache, which leads to inconsistency.


Distributed Caching in ASP.NET Core

What Is Distributed Cache?

Distributed caching stores cached data in an external storage system that multiple application instances can share.

ASP.NET Core provides IDistributedCache for this purpose.


How to Configure Distributed Cache

Example using SQL Server:

builder.Services.AddDistributedSqlServerCache(options =>
{
    options.ConnectionString = "YourConnectionString";
    options.SchemaName = "dbo";
    options.TableName = "CacheTable";
});
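The backing table is not created automatically. The dotnet-sql-cache global tool (from the Microsoft.Extensions.Caching.SqlConfig.Tools package) can generate it; the connection string, schema, and table name below should match the options registered above:

```shell
# Install the tool once, then create the cache table
dotnet tool install --global dotnet-sql-cache
dotnet sql-cache create "YourConnectionString" dbo CacheTable
```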

Using IDistributedCache

public class UserService
{
    private readonly IDistributedCache _cache;

    public UserService(IDistributedCache cache)
    {
        _cache = cache;
    }

    public async Task<string> GetUserAsync(string userId)
    {
        var cachedData = await _cache.GetStringAsync(userId);

        if (cachedData != null)
            return cachedData;

        var userData = await LoadFromDatabase(userId);

        await _cache.SetStringAsync(userId, userData,
            new DistributedCacheEntryOptions
            {
                AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5)
            });

        return userData;
    }
}

When to Use Distributed Cache

Use it when:

  • You run multiple API instances
  • You deploy in load-balanced environments
  • You need shared caching
  • You want consistency across servers

Drawbacks

  • Slightly slower than in-memory
  • Requires external storage
  • Adds infrastructure complexity

Redis Caching in ASP.NET Core

What Is Redis?

Redis is a high-performance, in-memory data store used as a distributed cache.

It is:

  • Extremely fast
  • Scalable
  • Production-grade
  • Ideal for high-traffic systems

Why Redis Is Popular

Redis offers:

  • Sub-millisecond response time
  • High throughput
  • Data persistence options
  • Advanced caching features

For scalable APIs, Redis is often the best choice.


How to Configure Redis in ASP.NET Core

Install Package

dotnet add package Microsoft.Extensions.Caching.StackExchangeRedis

Register Redis Cache

builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379";
    options.InstanceName = "MyApp_";
});
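Once registered, Redis is consumed through the same IDistributedCache abstraction shown earlier, so your services do not change when the backing store does. Because IDistributedCache stores strings or byte arrays, complex objects must be serialized first. A sketch using System.Text.Json (the ProductCache class and "product:" key scheme are illustrative, not part of the framework):

```csharp
using System.Text.Json;
using Microsoft.Extensions.Caching.Distributed;

public class ProductCache
{
    private readonly IDistributedCache _cache;

    public ProductCache(IDistributedCache cache) => _cache = cache;

    public async Task<Product?> GetAsync(string id)
    {
        // Keys are transparently prefixed with InstanceName ("MyApp_")
        var json = await _cache.GetStringAsync($"product:{id}");
        return json is null ? null : JsonSerializer.Deserialize<Product>(json);
    }

    public Task SetAsync(Product product) =>
        _cache.SetStringAsync(
            $"product:{product.Id}",
            JsonSerializer.Serialize(product),
            new DistributedCacheEntryOptions
            {
                AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5)
            });
}
```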

When to Use Redis

Use Redis when:

  • You expect high traffic
  • You need distributed cache
  • You want high availability
  • You run cloud-based microservices

Redis is the preferred option for serious production systems.


Cache Expiration Strategies

Choosing the right expiration strategy is critical.

Absolute Expiration

Cache expires after a fixed time.

Best for:

  • Static reference data

Sliding Expiration

Cache resets expiration timer whenever accessed.

Best for:

  • Frequently used user sessions
  • Frequently accessed configuration
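Both strategies are expressed through cache entry options, and they can be combined so that a sliding window never keeps an entry alive forever. A minimal sketch with IMemoryCache (the equivalent properties exist on DistributedCacheEntryOptions; _cache and session are assumed from the earlier examples):

```csharp
var options = new MemoryCacheEntryOptions
{
    // Hard ceiling: evicted 30 minutes after being written, no matter how often it is read
    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(30),

    // Idle timeout: evicted after 5 minutes without access; each read resets this timer
    SlidingExpiration = TimeSpan.FromMinutes(5)
};

_cache.Set("user-session", session, options);
```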

Cache Invalidation

Invalidation is harder than caching.

When data changes:

  • Remove cache entry
  • Update cache immediately
  • Use event-driven invalidation

Improper invalidation causes stale data issues.
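The first two options can be sketched as a write-through update: when the data changes, remove (or overwrite) the cache entry in the same operation so the next read repopulates from the database. Names follow the earlier ProductService example; SaveProductsToDatabase is a hypothetical persistence call:

```csharp
public async Task UpdateProductsAsync(List<Product> products)
{
    await SaveProductsToDatabase(products);

    // Remove the stale entry so the next GetProductsAsync reloads fresh data
    _cache.Remove("products");

    // Alternatively, overwrite the entry immediately instead of removing it:
    // _cache.Set("products", products, TimeSpan.FromMinutes(10));
}
```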


Common Caching Mistakes to Avoid

❌ Caching Everything

Not all data should be cached. Sensitive or rapidly changing data may not be suitable.


❌ Forgetting Expiration

Never cache without an expiration strategy.


❌ Caching Large Objects

Large objects increase memory pressure and network overhead.


❌ Ignoring Distributed Environment

In-memory cache fails in multi-instance systems.


Performance Comparison Overview

Feature                   In-Memory   Distributed   Redis
Speed                     Fastest     Moderate      Very Fast
Multi-Instance Support    ❌ No       ✅ Yes        ✅ Yes
Setup Complexity          Low         Medium        Medium
Production Scalability    Low         Medium        High

Interview Tip: How to Explain Caching in ASP.NET Core

If asked:

“How do you improve API performance?”

Answer:

  1. Measure bottlenecks
  2. Apply in-memory cache for local optimization
  3. Use distributed cache for scaling
  4. Use Redis for high traffic
  5. Design proper expiration and invalidation

This demonstrates architectural maturity.


Real-World Use Cases for Caching

Cache:

  • Product catalog
  • Country/state lists
  • Permissions and roles
  • Configuration data
  • Frequently accessed reports

Do not cache:

  • Highly sensitive financial transactions
  • Rapidly changing stock data (unless carefully controlled)

Frequently Asked Questions (FAQ)

1️⃣ What is caching in ASP.NET Core?

Caching in ASP.NET Core is a technique that stores frequently accessed data temporarily in memory or distributed storage to reduce database calls and improve API performance. It helps applications respond faster and handle higher traffic efficiently.


2️⃣ What is the difference between In-Memory Cache and Distributed Cache in ASP.NET Core?

In-memory cache stores data inside the application’s memory and works only for a single instance. Distributed cache stores data in an external storage system, allowing multiple application instances to share cached data. Distributed cache is suitable for load-balanced environments.


3️⃣ When should I use Redis in ASP.NET Core?

You should use Redis when:

  • Your application runs on multiple servers
  • You expect high traffic
  • You need fast distributed caching
  • You want production-grade scalability

Redis provides high performance and shared cache across instances.


4️⃣ Is In-Memory Cache suitable for production applications?

In-memory cache is suitable for small or single-instance applications. However, it is not ideal for scalable cloud environments because each instance maintains its own cache, leading to inconsistency.


5️⃣ How does caching improve ASP.NET Core API performance?

Caching reduces repeated database and external service calls. By serving data directly from memory or Redis, response time decreases significantly and server load is reduced.


6️⃣ What is Absolute Expiration vs Sliding Expiration?

Absolute expiration removes cache after a fixed time period.
Sliding expiration resets the expiration timer every time the cached item is accessed.

Use absolute expiration for static data and sliding expiration for frequently accessed user data.


7️⃣ What is cache invalidation in ASP.NET Core?

Cache invalidation is the process of removing or updating cached data when the underlying data changes. Without proper invalidation, applications may serve outdated or stale data.


8️⃣ Is Redis faster than In-Memory Cache?

In-memory cache is slightly faster because it runs inside the application process. However, Redis is still extremely fast and supports distributed systems, making it better for scalable production environments.


9️⃣ What data should not be cached?

Avoid caching:

  • Highly sensitive financial data
  • Frequently changing transactional data
  • Real-time stock or live trading data
  • Very large objects

Improper caching can lead to stale or inconsistent information.


🔟 How do I choose the right caching strategy in ASP.NET Core?

Choose:

  • In-Memory Cache → Single instance, simple apps
  • Distributed Cache → Multi-instance applications
  • Redis → High traffic, scalable cloud environments

Always consider scalability, consistency, and performance requirements before choosing.


Conclusion

Caching in ASP.NET Core is not optional in scalable systems. It is a fundamental performance optimization strategy.

Use:

  • In-memory cache for simple apps
  • Distributed cache for multi-instance environments
  • Redis for high-performance, scalable systems

When implemented correctly, caching reduces database load, improves response time, and ensures your APIs handle traffic spikes gracefully.

Strong backend engineers don’t just optimize queries.
They design smart caching strategies.

