What is a Cache?
- A cache is a hardware or software component that stores data so that future requests for the same data can be served faster.
- Because a cache sits closer to the CPU, accessing data from it is very fast.
- But a cache has limited capacity, so when it is full we have to decide which existing entry to evict before a new entry can be added.
- The problem with a FIFO (First-In First-Out) policy is that it can evict a heavily used entry just because it was added first.
- Since a cache must be fast, both the get and set operations should run in constant time.
What is an LRU Cache?
- It is a cache that follows the LRU (Least Recently Used) eviction policy.
- LRU means that when the cache is full, the entry with the oldest access time, i.e. the least recently used one, is evicted first (a small example follows this list).
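The policy itself can be illustrated with a short sketch using Python's collections.OrderedDict, which keeps keys in order and can move or pop them from either end. This is only to show what "least recently used" means; the design this post builds, a hash table plus a doubly linked list, is covered below.

```python
from collections import OrderedDict

cache = OrderedDict()
cache["a"] = 1
cache["b"] = 2
cache["c"] = 3

# Reading "a" counts as a use, so mark it as the most recently used key.
cache.move_to_end("a")

# The least recently used key now sits at the front of the ordering.
print(next(iter(cache)))   # b

# Evicting under LRU means popping from that front end.
cache.popitem(last=False)  # removes ("b", 2)
print(list(cache.keys()))  # ['c', 'a']
```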
Design of an LRU Cache
The design has the following constraints (a minimal interface sketch follows the list):
- Lookup in the cache should be fast: constant time, O(1).
- Similarly, adding an entry to the cache should also be fast: constant time, O(1).
- Eviction should follow the LRU policy: when the cache is full, remove the least recently used entry.
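Here is a minimal sketch of the interface these constraints imply. The class and method names (LRUCache, get, put) and the choice to return None on a miss are assumptions for illustration, not fixed by the post; the Approach section below fills in the implementation.

```python
class LRUCache:
    def __init__(self, capacity: int) -> None:
        """Create a cache that holds at most `capacity` entries."""

    def get(self, key):
        """Return the value for `key` in O(1) and mark it as recently used;
        return None if the key is absent."""

    def put(self, key, value) -> None:
        """Insert or update `key` in O(1); if the cache exceeds capacity,
        evict the least recently used entry."""
```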
Approach
We will use two data structures: a hash table and a doubly linked list.
Doubly linked list: holds all the (at most n) items in the cache, ordered from most recently used at the front to least recently used at the back.
Any item can be added to or removed from a doubly linked list in constant time O(1), given a reference to its node.
Hash table: maps each key to its node in the doubly linked list, so we can locate an item in O(1) instead of searching the list in O(n) time. A sketch of the full design follows.
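Putting the two structures together, here is a minimal sketch in Python. The class name LRUCache, the sentinel head/tail nodes, and returning None on a miss are assumptions made for this sketch; the core idea, a hash table mapping keys to doubly-linked-list nodes with the most recently used entry kept at the front, is the design described above.

```python
class Node:
    """Doubly linked list node holding one cache entry."""
    def __init__(self, key=None, value=None):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.map = {}          # key -> Node, for O(1) lookup
        # Sentinel head/tail nodes keep insert/remove free of edge cases.
        self.head = Node()     # most recently used side
        self.tail = Node()     # least recently used side
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        """Unlink a node from the list in O(1)."""
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_to_front(self, node):
        """Insert a node right after head (most recently used) in O(1)."""
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return None        # assumed miss behavior for this sketch
        node = self.map[key]
        # Accessing the entry makes it the most recently used one.
        self._remove(node)
        self._add_to_front(node)
        return node.value

    def put(self, key, value):
        if key in self.map:
            self._remove(self.map[key])
        node = Node(key, value)
        self.map[key] = node
        self._add_to_front(node)
        if len(self.map) > self.capacity:
            # Evict the least recently used entry (the node just before tail).
            lru = self.tail.prev
            self._remove(lru)
            del self.map[lru.key]
```

For example, with a capacity of 2:

```python
cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")         # "a" becomes the most recently used entry
cache.put("c", 3)      # over capacity, so "b" (the LRU entry) is evicted
print(cache.get("b"))  # None
print(cache.get("a"))  # 1
```

Both structures are updated together on every get and put, which is what keeps both operations at O(1).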