Distributed Cache with .NET Microservices
Ajay Jajoo – Senior Software Consultant
Sujit Meshram – Software Consultant
KnolX Etiquettes
Lack of etiquette and manners is a huge turn off.
 Punctuality
Join the session 5 minutes prior to the session start time. We start on
time and conclude on time!
 Feedback
Make sure to submit constructive feedback for all sessions, as it is very
helpful for the presenter.
 Silent Mode
Keep your mobile devices in silent mode, feel free to move out of session
in case you need to attend an urgent call.
 Avoid Disturbance
Avoid unwanted chit chat during the session.
1. Introduction to Caching
2. Advantages and Real-Life Examples of Caching
3. Types of Cache
4. What is Distributed Cache
5. Caching Patterns/Policies
6. Caching Eviction Techniques
7. What is Redis?
8. Benefits of Using Redis in Microservices
9. Setting Up Redis with .NET
10. Implementation and Demo
11. Conclusion
Introduction to Caching
 Definition: Caching is a technique to temporarily store frequently
accessed data in memory for quicker access.
 Importance:
− Performance: Faster data retrieval improves application response
time.
− Reduced Database Load: Fewer database queries reduce server
load.
− Enhanced User Experience: Users experience faster load times.
Real-Life Example: Instagram/Twitter
 Scenario: When a user visits another user's profile page on Instagram or
Twitter, they see the user's details, recent posts, follower count, etc.
 Without Caching: Every time a profile page is loaded, the app needs to
query the database multiple times to fetch this information. This can slow
down the app and increase the load on the database.
 With Caching: Frequently accessed data like user profiles, follower
counts, and recent posts are stored in a cache. This means the data can
be quickly retrieved from the cache instead of querying the database
each time.
Real-Life Example: Instagram/Twitter
 Benefits:
− Performance Improvement: The profile page loads much faster
since data is retrieved from the in-memory cache.
− Reduced Database Load: Fewer database queries lower the load on
the database and prevent slowdowns.
− Enhanced User Experience: Users experience faster load times and
seamless navigation.
 Example in Action:
When a user views a popular tweet or Instagram post, the post details
are cached. Subsequent views retrieve data from the cache, ensuring
a swift user experience and maintaining database performance.
− By using caching, Instagram and Twitter handle millions of users
accessing the same content simultaneously, ensuring a smooth and
responsive experience for their users.
Importance in Industry
Scalability: Handle more users and data by adding more
servers.
High Availability: Ensure critical services stay online
(e.g., health services, stock markets, military).
Fault Tolerance: Maintain operations even if parts of the
system fail.
Real-World Examples
Health Services: Hospitals need constant access to
patient records. Distributed caching ensures these records
are always available.
Stock Markets: Stock prices need to be updated in real-
time without delays. Distributed caching helps in providing
fast and reliable access.
Military: Military systems need to be operational at all
times. Distributed caching ensures data is always
accessible, even in case of server failures.
Types of Cache
.NET Core supports two types of caching:
1. In-Memory Caching: Stores data in the memory of the web
server. It is mostly suitable for single-server applications
or development/testing environments.
2. Distributed Caching: Stores data in a shared cache
accessible by multiple servers. It is suitable for scalable,
high-availability applications and microservices architectures.
Examples: Redis, Memcached
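As a sketch, the two caching types map onto standard service registrations in an ASP.NET Core `Program.cs`. The connection string `"localhost:6379"` and instance name are placeholder assumptions; the Redis registration requires the Microsoft.Extensions.Caching.StackExchangeRedis package.

```csharp
var builder = WebApplication.CreateBuilder(args);

// 1. In-memory caching: per-server, consumed via IMemoryCache
builder.Services.AddMemoryCache();

// 2. Distributed caching backed by Redis, consumed via IDistributedCache
//    (placeholder connection string and instance name)
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379";
    options.InstanceName = "SampleApp:";
});

var app = builder.Build();
app.Run();
```

With this wiring, controllers and services depend only on the `IMemoryCache` or `IDistributedCache` abstractions, so the backing store can change without touching application code.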
Distributed Cache
A distributed cache is a type of cache where data is stored
across multiple servers or nodes. These nodes can be in the
same data center or spread across different data centers
around the world.
Benefits of Distributed Cache
 Scalability:
− Explanation: Add more servers to handle more data or traffic.
− Example: If a website gets more visitors, more cache servers can
be added.
 High Availability:
− Explanation: Data remains available even if some servers fail.
− Example: If a server goes down, other servers can still serve the data.
 Fault Tolerance:
− Explanation: The system continues to function despite server failures.
− Example: If a server crashes, data is still accessible from other servers.
Caching Patterns/Policies
Some common caching patterns:
 Cache-Aside
 Read-Through
 Write-Through
 Write-Back/Behind
 Write-Around
 Refresh-Ahead
 Cache-Aside
- How It Works: The application checks the cache first. If data isn't there, it fetches
from the database, stores it in the cache, and returns it to the user.
- Best For: Read-heavy data that doesn't update often.
 Read-Through
- How It Works: On a cache miss, the cache itself loads the data from the
database, stores it in the cache, and returns it to the user.
- Best For: Data that benefits from lazy loading.
 Write-Through
- How It Works: Data is written to the cache and database at the same time, keeping
both in sync.
- Best For: Write-heavy applications needing data consistency.
 Write-Back/Behind
- How It Works: Data is written to the cache first and queued for database updates later,
allowing faster writes but with a risk of data loss if the cache fails.
- Best For: Applications needing fast writes with some risk tolerance.
 Write-Around
- How It Works: Data is written directly to the database. On read, it fetches from the
database and stores it in the cache.
- Best For: Applications that don’t re-read recent data often.
 Refresh-Ahead
- How It Works: The cache refreshes data before it expires, ensuring the latest data is
available when needed.
- Best For: Reducing latency by keeping the cache up-to-date.
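As an illustration of the write-through pattern above, here is a minimal sketch using StackExchange.Redis. The `db` handle and the `SaveToDatabase` helper are assumptions for illustration, not part of the original deck.

```csharp
// Write-through sketch: cache and database are updated together, so
// subsequent reads see consistent data. SaveToDatabase is hypothetical.
void SaveData(IDatabase db, string key, string value)
{
    SaveToDatabase(key, value);                          // 1. write to the system of record
    db.StringSet(key, value, TimeSpan.FromMinutes(10));  // 2. keep the cache in sync, with a TTL
}
```

The TTL is optional in write-through (the cache is kept in sync on every write), but it bounds staleness if a write path ever bypasses this method.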
Caching Eviction Techniques
When the cache is full, these techniques decide which items to remove:
 Least Recently Used (LRU): Removes items that haven't been used for a long time.
− How It Works: Keeps track of when items were last used. When the cache is full, it
removes the item that was used the longest time ago.
 Least Frequently Used (LFU): Removes items that have been used the least number
of times.
− How It Works: Keeps a count of how often each item is used. When the cache is full,
it removes the item with the lowest usage count.
 First In First Out (FIFO): Removes items in the order they were added, with the oldest
item being removed first.
− How It Works: Maintains a queue of items based on when they were added. When
the cache is full, it removes the item that was added earliest.
 Last In First Out (LIFO): Removes the most recently added items first.
- How It Works: Similar to a stack, the last item added is the first to be removed
when the cache is full.
 Most Recently Used (MRU): Removes the items that have been used most
recently.
- How It Works: Keeps track of when items were last used. When the cache is full, it
removes the item that was used most recently.
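The LRU policy described above can be sketched with a dictionary plus a linked list that tracks recency. This is an illustrative, single-threaded sketch, not production code.

```csharp
using System;
using System.Collections.Generic;

// Minimal LRU cache: O(1) get/put; evicts the least recently used entry.
class LruCache<TKey, TValue>
{
    private readonly int _capacity;
    private readonly Dictionary<TKey, LinkedListNode<(TKey Key, TValue Value)>> _map = new();
    private readonly LinkedList<(TKey Key, TValue Value)> _order = new(); // front = most recent

    public LruCache(int capacity) => _capacity = capacity;

    public bool TryGet(TKey key, out TValue value)
    {
        if (_map.TryGetValue(key, out var node))
        {
            _order.Remove(node);      // mark as most recently used
            _order.AddFirst(node);
            value = node.Value.Value;
            return true;
        }
        value = default!;
        return false;
    }

    public void Put(TKey key, TValue value)
    {
        if (_map.TryGetValue(key, out var existing))
        {
            _order.Remove(existing);
            _map.Remove(key);
        }
        else if (_map.Count >= _capacity)
        {
            var lru = _order.Last!;   // evict the least recently used item
            _order.RemoveLast();
            _map.Remove(lru.Value.Key);
        }
        var node = new LinkedListNode<(TKey Key, TValue Value)>((key, value));
        _order.AddFirst(node);
        _map[key] = node;
    }
}
```

LFU, FIFO, and MRU follow the same shape with a different bookkeeping structure (a frequency counter, a queue, or evicting from the front instead of the back).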
What is Redis?
 Redis (Remote Dictionary Server) is an open-source, in-
memory data structure store that can be used as a distributed
cache, database, and message broker.
 In practice, it is used to store frequently accessed and relatively
static data in the cache so it can be served quickly on demand.
 Redis provides many data structures for storing data, such as Lists,
Sets, Hashes, Streams, and more.
Key Features
In-Memory Storage: Data is stored in RAM for fast access.
Data Structures: Supports strings, lists, sets, sorted sets, hashes,
bitmaps, and more.
Persistence Options: Data can be saved to disk to prevent data
loss.
High Availability: Redis Sentinel provides high availability through
monitoring and failover.
Scalability: Redis Cluster allows horizontal scaling by distributing
data across multiple nodes.
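A few of these data structures, exercised through StackExchange.Redis as a sketch. The key names are illustrative, and a reachable Redis instance on localhost is assumed.

```csharp
using StackExchange.Redis;

var redis = ConnectionMultiplexer.Connect("localhost");
IDatabase db = redis.GetDatabase();

// List: an ordered log, e.g. recent posts
db.ListRightPush("recent:posts", "post:1");

// Set: unique members, e.g. followers
db.SetAdd("followers:alice", "bob");
long followers = db.SetLength("followers:alice");

// Hash: field/value pairs, e.g. a user profile
db.HashSet("user:1", new HashEntry[] { new("name", "Alice"), new("bio", "hello") });
string? name = db.HashGet("user:1", "name");
```

Picking the structure that matches the access pattern (list for ordered reads, set for membership checks, hash for partial field updates) avoids serializing whole objects on every change.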
Benefits of Using Redis in Microservices
 Performance:
− Speed: In-memory operations are much faster than disk-based operations.
− Low Latency: Immediate response time for read/write operations.
 Scalability:
− Horizontal Scaling: Redis Cluster can distribute data across multiple servers.
 Decoupling:
− Reduced Dependency: Microservices can fetch data from cache instead of hitting
the database every time.
 Availability:
− Failover: Redis Sentinel ensures availability by automatic failover.
− Replication: Data can be replicated to multiple Redis nodes.
Setting Up Redis with .NET
 Step 1. Download the Redis server (the archived Windows port of Redis) from the following URL.
https://github.com/microsoftarchive/redis/releases/tag/win-3.0.504
 Step 2. Extract the zip file and then open the Redis Server and Redis CLI.
Connection to Redis Cache
 Libraries: Use StackExchange.Redis library for .NET.
 Configuration: Sample configuration code for connecting to a Redis
instance
Sample code:
var redis = ConnectionMultiplexer.Connect("localhost");
IDatabase db = redis.GetDatabase();
Example: Basic operations (SET, GET) in .NET.
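The basic SET/GET operations mentioned above, as a short runnable sketch. It assumes a local Redis instance, and the key names are illustrative.

```csharp
using StackExchange.Redis;

var redis = ConnectionMultiplexer.Connect("localhost");
IDatabase db = redis.GetDatabase();

db.StringSet("greeting", "Hello, Redis!");   // SET greeting "Hello, Redis!"
string? value = db.StringGet("greeting");    // GET greeting
Console.WriteLine(value);

// A TTL lets Redis expire the entry automatically
db.StringSet("session:42", "user-data", TimeSpan.FromMinutes(5));
```

`ConnectionMultiplexer` is designed to be created once and shared (it is thread-safe), so in a real service it would typically be registered as a singleton rather than constructed per request.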
Implementing Cache in .Net Microservices
Cache Aside Pattern: The pattern where the
application first checks the cache before querying the
database.
Code Example: Simple example of implementing
cache aside pattern.
Sample Code
string GetDataFromCache(string key)
{
    // 1. Try the cache first
    string value = db.StringGet(key);
    if (value == null)
    {
        // 2. Cache miss: load from the database
        value = GetDataFromDatabase(key);
        // 3. Populate the cache (with a TTL) for subsequent reads
        db.StringSet(key, value, TimeSpan.FromMinutes(10));
    }
    return value;
}
Implementation
 Step 1. Create the .NET Core API Web Application
 Step 2. Install the following NuGet packages, which our application needs:
− Microsoft.EntityFrameworkCore
− Microsoft.EntityFrameworkCore.Design
− Microsoft.EntityFrameworkCore.SqlServer
− Microsoft.EntityFrameworkCore.Tools
− Swashbuckle.AspNetCore
− StackExchange.Redis
Conclusion
Redis as a distributed cache significantly enhances the performance and
scalability of .NET microservices by providing fast, in-memory data
access.
It reduces database load, improves response times, and ensures high
availability with features like replication and failover.
By integrating Redis with .NET Core, developers can build efficient,
scalable, and reliable microservices. To maximize the benefits, it's crucial
to follow best practices like proper cache invalidation, data expiration, and
monitoring.