SoFunction
Updated on 2025-04-14

Detailed explanation of code that implements an LRU cache in Java

1. Basic ideas of LRU caching

An LRU cache is a fixed-size cache. Whenever the cache reaches its capacity limit, it automatically deletes the least recently used cache item. LRU caches are widely used in data storage, graphics processing, operating systems, and other fields.

Key points of LRU cache:

  1. Keep recently used data: the cache retains the most recently accessed items.
  2. Evict the least recently used data: when the cache is full, the least recently used item is deleted.
  3. Maintain access order: the access order of cached items is usually maintained with a linked list.

2. The core steps of LRU cache implementation

  1. Cache capacity limit: the cache size is fixed; when the capacity is reached, the least recently used item must be deleted.
  2. Fast lookup and deletion: a Map data structure provides fast lookup and deletion, while a doubly linked list maintains the access order of the elements.
  3. Access-order maintenance: each time an item is accessed, it is moved to the head of the linked list to mark it as recently used; when the capacity is exceeded, the element at the tail of the linked list is deleted.
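The Map-plus-doubly-linked-list design described in these steps can also be sketched by hand. This is only an illustrative sketch; the class and member names below (HandRolledLRU, Node, moveToFront, and so on) are our own choices, not part of the implementation shown later in this article:

```java
import java.util.HashMap;
import java.util.Map;

// A hand-rolled LRU cache: a HashMap for O(1) lookup plus a doubly linked
// list (with sentinel head/tail nodes) to track access order.
public class HandRolledLRU<K, V> {
    private static class Node<K, V> {
        K key; V value; Node<K, V> prev, next;
        Node(K k, V v) { key = k; value = v; }
    }

    private final int capacity;
    private final Map<K, Node<K, V>> index = new HashMap<>();
    private final Node<K, V> head = new Node<>(null, null); // most recently used side
    private final Node<K, V> tail = new Node<>(null, null); // least recently used side

    public HandRolledLRU(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public V get(K key) {
        Node<K, V> node = index.get(key);
        if (node == null) return null;
        moveToFront(node); // mark as most recently used
        return node.value;
    }

    public void put(K key, V value) {
        Node<K, V> node = index.get(key);
        if (node != null) {       // existing key: update and refresh position
            node.value = value;
            moveToFront(node);
            return;
        }
        if (index.size() >= capacity) {
            // Evict the node just before the tail sentinel: least recently used
            Node<K, V> lru = tail.prev;
            unlink(lru);
            index.remove(lru.key);
        }
        node = new Node<>(key, value);
        index.put(key, node);
        linkFront(node);
    }

    private void moveToFront(Node<K, V> node) { unlink(node); linkFront(node); }

    private void unlink(Node<K, V> node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void linkFront(Node<K, V> node) {
        node.prev = head;
        node.next = head.next;
        head.next.prev = node;
        head.next = node;
    }

    public static void main(String[] args) {
        HandRolledLRU<Integer, String> cache = new HandRolledLRU<>(2);
        cache.put(1, "a");
        cache.put(2, "b");
        cache.get(1);        // 1 becomes the most recently used entry
        cache.put(3, "c");   // evicts 2, the least recently used entry
        System.out.println(cache.get(2)); // prints null
    }
}
```

The sentinel head and tail nodes keep the unlink and relink logic free of null checks, which is why most hand-written LRU implementations use them.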

3. Java implementation of LRU cache

We will use LinkedHashMap to implement the LRU cache. LinkedHashMap maintains the insertion order of its elements by default; by setting accessOrder to true, it maintains them in access order instead.
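Before the full implementation, a quick standalone snippet shows what the accessOrder flag does (the class and variable names here are ours, chosen for illustration):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class AccessOrderDemo {
    public static void main(String[] args) {
        // Third constructor argument accessOrder = true: iteration order
        // follows access order (least recently used first), not insertion order.
        Map<Integer, String> map = new LinkedHashMap<>(16, 0.75f, true);
        map.put(1, "A");
        map.put(2, "B");
        map.put(3, "C");
        map.get(1); // accessing 1 moves it to the end (most recently used)
        System.out.println(map); // prints {2=B, 3=C, 1=A}
    }
}
```

With accessOrder left at its default of false, the same sequence would print {1=A, 2=B, 3=C}.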

4. Implementation code

import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;
 
public class LRUCache<K, V> {
    private final int capacity;
    private final Map<K, V> cache;
 
    // Constructor: initialize the capacity and create a LinkedHashMap
    public LRUCache(int capacity) {
        this.capacity = capacity;
        // Passing accessOrder = true makes the LinkedHashMap maintain
        // access order instead of insertion order
        this.cache = new LinkedHashMap<>(capacity, 0.75f, true);
    }
 
    // Get a value from the cache
    public V get(K key) {
        if (!cache.containsKey(key)) {
            return null;  // If this item is not in the cache, return null
        }
        // The item is present: return its value; the access moves it to the
        // end of the linked list (the most recently used position)
        return cache.get(key);
    }
 
    // Add an element to the cache
    public void put(K key, V value) {
        // Only evict when inserting a new key would exceed the capacity
        if (cache.size() >= capacity && !cache.containsKey(key)) {
            // The cache is full: remove the least recently used element
            // (the element at the head of the linked list)
            Iterator<Map.Entry<K, V>> iterator = cache.entrySet().iterator();
            if (iterator.hasNext()) {
                iterator.next();
                iterator.remove();
            }
        }
        cache.put(key, value);  // Put the new element into the cache
    }
 
    // Print the contents of the cache
    public void printCache() {
        System.out.println(cache);
    }
 
    public static void main(String[] args) {
        LRUCache<Integer, String> lruCache = new LRUCache<>(3);
 
        // Add elements to the cache
        lruCache.put(1, "A");
        lruCache.put(2, "B");
        lruCache.put(3, "C");
 
        // Print the cached content
        lruCache.printCache();  // Output: {1=A, 2=B, 3=C}
 
        // Access some cache items
        lruCache.get(1);       // Access 1
        lruCache.put(4, "D");  // Insert a new element; the cache is full
 
        lruCache.printCache();  // Output: {3=C, 1=A, 4=D} (2, the least recently used, was removed)
 
        // Continue accessing cache items
        lruCache.get(3);       // Access 3
        lruCache.put(5, "E");  // Insert a new element
 
        lruCache.printCache();  // Output: {4=D, 3=C, 5=E} (1, the least recently used, was removed)
    }
}

5. Code interpretation

  1. The LRUCache class

    • Uses a LinkedHashMap to store the cached data. Its constructor passes true as the third argument (accessOrder) to keep entries ordered by access.
    • The get method retrieves data from the cache. If the entry exists, the access automatically moves it to the most recently used position (the end of the linked list).
    • The put method adds data to the cache. If the cache is full, the least recently used entry (the head element of the linked list) is deleted first.
  2. Cache capacity control

    • When the number of cached elements reaches the specified capacity, the put method removes the element at the head of the linked list through an iterator, ensuring that the cache never exceeds its maximum capacity.
  3. The printCache method

    • Prints the current contents of the cache, which helps with debugging and inspecting the cache state.
  4. The main method

    • Demonstrates how to use the LRU cache with an example.
    • Adds several elements to the cache, then accesses some of them and prints the remaining contents of the cache.
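As a side note, LinkedHashMap also provides a built-in hook for this eviction step: overriding removeEldestEntry avoids the manual iterator logic entirely. A minimal sketch (the class and variable names here are ours):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class RemoveEldestDemo {
    public static void main(String[] args) {
        final int capacity = 3;
        // removeEldestEntry is invoked after each put; returning true makes
        // the map evict its eldest (least recently used) entry automatically.
        Map<Integer, String> cache = new LinkedHashMap<>(capacity, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Integer, String> eldest) {
                return size() > capacity;
            }
        };
        cache.put(1, "A");
        cache.put(2, "B");
        cache.put(3, "C");
        cache.get(1);       // 1 becomes the most recently used entry
        cache.put(4, "D");  // 2, the least recently used, is evicted
        System.out.println(cache); // prints {3=C, 1=A, 4=D}
    }
}
```

This variant behaves the same as the article's put logic for the sequence above, but delegates the capacity check to the map itself.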

6. How LRU cache works

  1. Cache initialization: when constructing an LRUCache, specify the maximum capacity of the cache.
  2. Adding to the cache: elements are added with the put method. If the cache is full, the least recently used element is deleted first.
  3. Accessing the cache: elements are read with the get method; after an access, the element is marked as recently used.
  4. LRU eviction: when the capacity limit is reached, the element at the head of the linked list, that is, the least recently used element, is deleted.

7. Summary

  • An LRU cache is a common cache replacement strategy for managing data in a fixed-size cache. By tracking the order in which elements are used, it ensures that the least recently used elements are evicted first.
  • This article showed how to implement an LRU cache in Java using LinkedHashMap. Its access-order feature maintains the elements in order of access, so the least recently used element can be deleted when the cache is full.

This implementation is a simplified version suitable for small-scale cache scenarios. If more sophisticated cache control is required (for example, concurrency support), it can be further optimized and extended.
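For simple concurrency needs, one straightforward extension is to guard every operation with a lock. The sketch below (class and method names are our own; it mirrors the article's LinkedHashMap approach rather than being part of it) uses synchronized methods, which is correct but coarse-grained; heavily concurrent workloads are better served by a purpose-built cache library:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// A minimal thread-safe LRU sketch: every operation locks the whole object.
public class SynchronizedLRU<K, V> {
    private final int capacity;
    private final Map<K, V> cache;

    public SynchronizedLRU(int capacity) {
        this.capacity = capacity;
        this.cache = new LinkedHashMap<>(capacity, 0.75f, true);
    }

    public synchronized V get(K key) {
        return cache.get(key); // access moves the entry to the most recent end
    }

    public synchronized void put(K key, V value) {
        if (cache.size() >= capacity && !cache.containsKey(key)) {
            // Evict the first key in iteration order: the least recently used
            K eldest = cache.keySet().iterator().next();
            cache.remove(eldest);
        }
        cache.put(key, value);
    }

    public synchronized int size() {
        return cache.size();
    }

    public static void main(String[] args) {
        SynchronizedLRU<Integer, String> c = new SynchronizedLRU<>(2);
        c.put(1, "A");
        c.put(2, "B");
        c.get(1);      // 1 becomes the most recently used entry
        c.put(3, "C"); // evicts 2, the least recently used entry
        System.out.println(c.get(2)); // prints null
    }
}
```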
