LeetCode 146. LRU Cache

Preface

Caching is a technique for improving data-read performance. A CPU reads data much faster than main memory can supply it, so a CPU cache sits between the CPU and main memory; likewise, main memory serves as a cache for the hard disk. Because main memory is far larger than the CPU cache, and the disk far larger than main memory, the cache inevitably fills up, and a cache eviction policy must decide which data to discard and which to keep. Three common policies are first in, first out (FIFO), least frequently used (LFU), and least recently used (LRU).

Problem description

Design and implement a data structure for a least recently used (LRU) cache. Implement the LRUCache class:

  • LRUCache(int capacity) initializes the LRU cache with a positive integer capacity.
  • int get(int key) returns the value of the key if it exists in the cache; otherwise returns -1.
  • void put(int key, int value) updates the value if the key already exists; otherwise inserts the key-value pair. If the insertion causes the number of keys to exceed the capacity, the least recently used key is evicted first to make room for the new entry.
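Before building the structure by hand, the required semantics can be demonstrated with the JDK's java.util.LinkedHashMap in access order; this is only a sketch of the expected behavior (the capacity of 2 and the key sequence follow the classic example for this problem), not the solution developed below.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LruDemo {
    public static void main(String[] args) {
        final int capacity = 2;
        // accessOrder=true: iteration order runs from least to most recently used;
        // removeEldestEntry evicts the LRU entry once capacity is exceeded.
        Map<Integer, Integer> cache =
                new LinkedHashMap<Integer, Integer>(capacity, 0.75f, true) {
                    @Override
                    protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
                        return size() > capacity;
                    }
                };

        cache.put(1, 1);
        cache.put(2, 2);
        System.out.println(cache.get(1));              // 1 (also marks key 1 as recently used)
        cache.put(3, 3);                               // evicts key 2, the least recently used
        System.out.println(cache.getOrDefault(2, -1)); // -1: key 2 was evicted
        cache.put(4, 4);                               // evicts key 1
        System.out.println(cache.getOrDefault(1, -1)); // -1
        System.out.println(cache.get(3));              // 3
        System.out.println(cache.get(4));              // 4
    }
}
```

Note that a plain get also counts as a "use": after cache.get(1), key 2 becomes the least recently used entry and is the one evicted by put(3, 3).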

Solution approach: hash table + doubly linked list

  • A doubly linked list fits the characteristics of LRU: nodes can be moved to the head or removed from the tail in O(1).
  • The get method looks up the data. If it exists, return its value and move the node to the head of the list, marking it as most recently used.
  • The put method stores data. If the key exists, overwrite its value; if not, insert a new node. In both cases the node ends up at the head of the list. After an insertion, check whether the cache exceeds its capacity; if so, delete the tail node of the list, which is the least recently used entry.
  • With the linked list alone, finding a node requires traversing the list on every get or delete, which is O(n). A hash map from key to node records each node's location and reduces the lookup to O(1).
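The O(1) removal that motivates the doubly linked list can be seen in isolation. Node below is a hypothetical stripped-down node for illustration only, not part of the final solution: since each node knows both of its neighbours, unlinking it needs no traversal from the head.

```java
public class UnlinkDemo {
    // Hypothetical minimal doubly linked node.
    static class Node {
        int key;
        Node prev, next;
        Node(int key) { this.key = key; }
    }

    // O(1): rewire the two neighbours around the node being removed.
    static void removeNode(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    public static void main(String[] args) {
        // Build head <-> a <-> tail, with head and tail acting as sentinels.
        Node head = new Node(-1), a = new Node(1), tail = new Node(-1);
        head.next = a; a.prev = head;
        a.next = tail; tail.prev = a;

        removeNode(a); // no loop, no search: constant time
        System.out.println(head.next == tail); // true
    }
}
```

This is exactly the operation the hash map makes usable: the map hands back the node in O(1), and the list unlinks it in O(1).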
import java.util.HashMap;
import java.util.Map;

class LRUCache {
    // Doubly linked list node; it stores the key so the hash map entry
    // can be removed when the node is evicted from the tail.
    static class DLinkedNode {
        int key;
        int value;
        DLinkedNode prev;
        DLinkedNode next;

        public DLinkedNode() {}

        public DLinkedNode(int key, int value) {
            this.key = key;
            this.value = value;
        }
    }

    private int size;
    private final int capacity;
    private final DLinkedNode head;  // sentinel on the most recently used side
    private final DLinkedNode tail;  // sentinel on the least recently used side
    private final Map<Integer, DLinkedNode> cache = new HashMap<>();

    public LRUCache(int capacity) {
        this.size = 0;
        this.capacity = capacity;
        head = new DLinkedNode();
        tail = new DLinkedNode();
        head.next = tail;
        tail.prev = head;
    }

    public int get(int key) {
        DLinkedNode node = cache.get(key);
        if (node == null) {
            return -1;
        }
        // Found: mark as most recently used by moving to the head
        moveToHead(node);
        return node.value;
    }

    public void put(int key, int value) {
        DLinkedNode node = cache.get(key);
        if (node == null) {
            // Key absent: create a new node and place it at the head
            DLinkedNode newNode = new DLinkedNode(key, value);
            cache.put(key, newNode);
            addToHead(newNode);
            size++;
            if (size > capacity) {
                // Capacity exceeded: evict the least recently used node
                DLinkedNode removed = removeTail();
                cache.remove(removed.key);
                size--;
            }
        } else {
            // Key present: overwrite the value and mark as most recently used
            node.value = value;
            moveToHead(node);
        }
    }

    private DLinkedNode removeTail() {
        DLinkedNode node = tail.prev;
        removeNode(node);
        return node;
    }

    private void removeNode(DLinkedNode node) {
        node.next.prev = node.prev;
        node.prev.next = node.next;
    }

    private void moveToHead(DLinkedNode node) {
        removeNode(node);
        addToHead(node);
    }

    private void addToHead(DLinkedNode node) {
        node.prev = head;
        node.next = head.next;
        head.next.prev = node;
        head.next = node;
    }
}

References

  • LRU, Wikipedia
  • Geek Time, Wang Zheng: "How to implement an LRU cache eviction algorithm?"

7 September 2021, 18:22
