Caching Strategies

Author: Bethwel Kiptoo

15 July, 2024



A cache is hardware or software that temporarily stores data in a computing environment. Cache management strategies determine which data to keep in the cache and which data to evict when the cache is full and needs to make room for new data.

How Does a Cache Work?

Cache hit: a client requests data and the data is found in the cache, so it is served directly.
Cache miss: the data is not found in the cache, so it is pulled from main memory (the slower backing store) and stored in the cache for future requests.
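As a minimal read-through sketch of this hit/miss flow (the fetch_from_main_memory helper below is hypothetical, standing in for the slower backing store):

```python
# Minimal read-through cache sketch: check the cache first,
# fall back to the slower main store on a miss.
cache = {}

def fetch_from_main_memory(key):
    # Hypothetical placeholder for the slow backing store.
    return f"value-for-{key}"

def get(key):
    if key in cache:                         # cache hit: serve from the cache
        return cache[key]
    value = fetch_from_main_memory(key)      # cache miss: go to main memory
    cache[key] = value                       # store it so the next access hits
    return value
```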

1. Least Recently Used (LRU)

Least Recently Used (LRU) is a caching algorithm in which the least recently used cache block is evicted whenever the cache overflows.

How LRU Works

An LRU cache tracks its size (memory storage), the type of each request (get/put), and how recently each entry was accessed; when the cache is full, the entry accessed least recently is evicted.
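A minimal LRU sketch in Python, using collections.OrderedDict so that insertion order doubles as recency order (the LRUCache name and capacity parameter are illustrative, not from the article):

```python
from collections import OrderedDict

class LRUCache:
    """Evicts the least recently used entry when capacity is exceeded."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()  # insertion order doubles as recency order

    def get(self, key):
        if key not in self.store:
            return None                  # cache miss
        self.store.move_to_end(key)      # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)  # updating also refreshes recency
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict the least recently used
```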

2. Least Frequently Used (LFU)

Least Frequently Used (LFU) is a caching algorithm in which the least frequently used cache block is evicted whenever the cache overflows.

How LFU Works

An LFU cache tracks its size (memory storage), the type of each request (get/put), and an access-frequency count per entry; when the cache is full, the entry with the lowest frequency is evicted.
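A minimal LFU sketch along the same lines; it scans for the lowest-count entry on eviction, so eviction is O(n) here rather than the O(1) frequency-bucket designs used in production caches (all names are illustrative):

```python
import time

class LFUCache:
    """Evicts the entry with the lowest access count when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.values = {}
        self.freq = {}       # key -> access count
        self.last_used = {}  # key -> timestamp, to break frequency ties

    def get(self, key):
        if key not in self.values:
            return None      # cache miss
        self.freq[key] += 1
        self.last_used[key] = time.monotonic()
        return self.values[key]

    def put(self, key, value):
        if key not in self.values and len(self.values) >= self.capacity:
            # Evict the least frequently used key; oldest access breaks ties.
            victim = min(self.values,
                         key=lambda k: (self.freq[k], self.last_used[k]))
            for table in (self.values, self.freq, self.last_used):
                del table[victim]
        self.values[key] = value
        self.freq[key] = self.freq.get(key, 0) + 1
        self.last_used[key] = time.monotonic()
```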

3. First-In, First-Out (FIFO)

Definition:
FIFO (First-In, First-Out) is a cache management policy where the first item added to the cache is the first one removed when the cache reaches its maximum capacity. It operates on the principle of a queue data structure.

How FIFO Works

Data Structures:
Queue: a simple queue (or list) maintains the order of insertion; the oldest items sit at the front and the newest at the back.
Hash Map (optional): often used alongside the queue to map keys to values for efficient access.

Operation of FIFO

Insert (key, value): add the new item to the back of the queue. If the cache is full, remove the item at the front of the queue (the oldest item) before inserting the new one. Time complexity: O(1) for both insertion and eviction.
Access (key): if the item is in the cache, it can be accessed directly (when a hash map is used for storage); access does not change the order of items in the queue. Time complexity: O(1) with a hash map. A sketch of both operations follows below.
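A minimal FIFO sketch matching the operations above, using collections.deque as the queue plus a dict as the hash map (the FIFOCache name is illustrative):

```python
from collections import deque

class FIFOCache:
    """Evicts the oldest inserted entry when capacity is exceeded."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.order = deque()  # insertion order: oldest at the left
        self.store = {}       # hash map for O(1) key -> value access

    def get(self, key):
        # Access does not change insertion order in FIFO.
        return self.store.get(key)

    def put(self, key, value):
        if key not in self.store:
            if len(self.store) >= self.capacity:
                oldest = self.order.popleft()  # evict the front of the queue
                del self.store[oldest]
            self.order.append(key)             # new items join the back
        self.store[key] = value
```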

4. LIFO

LIFO (Last-In, First-Out) is a cache management policy where the most recently added item is the first one removed when the cache reaches its maximum capacity. It operates on the principle of a stack data structure.

Operation

Insert (key, value): add the new item to the top of the stack. If the cache is full, remove the most recently added item (the top of the stack) before inserting the new one. Time complexity: O(1) for both insertion and eviction.
Access (key): if the item is in the cache, it can be accessed directly (when a hash map is used for storage); access does not change the order of items. Time complexity: O(1) with a hash map. A sketch follows below.
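The same sketch adapted to LIFO, with a plain Python list serving as the stack (the LIFOCache name is illustrative):

```python
class LIFOCache:
    """Evicts the most recently inserted entry when capacity is exceeded."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.stack = []  # insertion order: newest at the top (end of list)
        self.store = {}  # hash map for O(1) access

    def get(self, key):
        # Access does not change insertion order.
        return self.store.get(key)

    def put(self, key, value):
        if key not in self.store:
            if len(self.store) >= self.capacity:
                newest = self.stack.pop()  # evict the top of the stack
                del self.store[newest]
            self.stack.append(key)
        self.store[key] = value
```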

5. Random Replacement

Random Replacement (RR) is a cache management policy where a randomly chosen item is removed when the cache reaches its maximum capacity. Because it keeps no record of access order or frequency, it is simple and cheap to implement.
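A minimal random-replacement sketch; note that no access history is tracked at all (the RandomCache name is illustrative):

```python
import random

class RandomCache:
    """Evicts a uniformly random entry when capacity is exceeded."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}

    def get(self, key):
        return self.store.get(key)

    def put(self, key, value):
        if key not in self.store and len(self.store) >= self.capacity:
            victim = random.choice(list(self.store))  # no history needed
            del self.store[victim]
        self.store[key] = value
```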

Applications

Web Browsers: cache frequently accessed resources (such as images and CSS files) to speed up page load times.
Content Delivery Networks (CDNs): cache web content closer to the user's location to reduce latency and improve load times.
Database Query Caching: databases cache query results to serve frequently requested data without re-executing the queries.
API Gateways: cache API responses to reduce the number of calls to backend services, improving response times and reducing backend load.

Advantages and Disadvantages

Advantages:
Improved Performance and Reduced Latency: frequently requested data is served from the cache rather than the slower source.
Reduced Load on Primary Data Sources: fewer requests reach primary data sources such as databases and APIs.
Scalability: applications can manage higher loads and scale more effectively.

Disadvantages:
Cache Miss Penalty: a miss adds cache-lookup overhead on top of the fetch from the primary source, which can reduce performance.
Memory and Storage Overhead: the additional storage required for the cache increases costs.
