1. Basic ideas of LRU caching
An LRU (Least Recently Used) cache is a fixed-size cache: whenever the number of entries reaches the capacity limit, the cache automatically evicts the least recently used entry. LRU caches are widely used in data storage, graphics processing, operating systems, and other fields.
Key points of an LRU cache:
- Keep recent data: the cache retains recently accessed entries.
- Evict stale data: when the cache is full, the least recently used entry is removed.
- Maintain access order: the access order of cached entries is usually tracked with a linked list.
2. The core steps of LRU cache implementation
- Cache capacity limit: the cache size is fixed; when capacity is reached, the least recently used entry must be deleted.
- Fast lookup and deletion: a `Map` (hash table) provides fast lookup and deletion, while a doubly linked list maintains the access order of the entries.
- Access ordering: each time an entry is accessed it is moved to the tail of the linked list, marking it as most recently used; when capacity is exceeded, the entry at the head of the list is deleted.
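The steps above can be sketched as a hand-rolled combination of a `HashMap` and a doubly linked list. This `SimpleLRU` class is an illustrative sketch (the class and its helper names are mine, not the article's implementation, which uses `LinkedHashMap` below):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: HashMap gives O(1) lookup; the doubly linked list tracks access order.
// Most recently used entries sit just before the tail; eviction removes head.next.
public class SimpleLRU<K, V> {
    private static class Node<K, V> {
        K key; V value; Node<K, V> prev, next;
        Node(K key, V value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final Map<K, Node<K, V>> index = new HashMap<>();
    private final Node<K, V> head = new Node<>(null, null); // sentinel on the LRU side
    private final Node<K, V> tail = new Node<>(null, null); // sentinel on the MRU side

    public SimpleLRU(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public V get(K key) {
        Node<K, V> node = index.get(key);
        if (node == null) return null;
        unlink(node);
        linkBeforeTail(node); // mark as most recently used
        return node.value;
    }

    public void put(K key, V value) {
        Node<K, V> node = index.get(key);
        if (node != null) {
            node.value = value;
            unlink(node);
        } else {
            if (index.size() >= capacity) {
                Node<K, V> lru = head.next; // least recently used entry
                unlink(lru);
                index.remove(lru.key);
            }
            node = new Node<>(key, value);
            index.put(key, node);
        }
        linkBeforeTail(node);
    }

    private void unlink(Node<K, V> n) {
        n.prev.next = n.next;
        n.next.prev = n.prev;
    }

    private void linkBeforeTail(Node<K, V> n) {
        n.prev = tail.prev;
        n.next = tail;
        tail.prev.next = n;
        tail.prev = n;
    }
}
```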
3. Java implementation of LRU cache
We will use `LinkedHashMap` to implement the LRU cache. By default, `LinkedHashMap` maintains the insertion order of its entries; by setting its `accessOrder` constructor parameter to `true`, it maintains entries in access order instead.
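A quick illustration of the `accessOrder` flag (this is standard `java.util.LinkedHashMap` behavior; the demo class name is mine):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class AccessOrderDemo {
    // Returns the iteration order after one access, to show accessOrder in action.
    static String demoOrder() {
        // Third constructor argument accessOrder=true: iteration follows access order.
        Map<Integer, String> map = new LinkedHashMap<>(16, 0.75f, true);
        map.put(1, "A");
        map.put(2, "B");
        map.put(3, "C");
        map.get(1); // touching key 1 moves it to the end (most recently used)
        return map.toString();
    }

    public static void main(String[] args) {
        System.out.println(demoOrder()); // prints {2=B, 3=C, 1=A}
    }
}
```

With `accessOrder` left at its default of `false`, the same sequence would print `{1=A, 2=B, 3=C}`.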
4. Implementation code
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCache<K, V> {
    private final int capacity;
    private final Map<K, V> cache;

    // Constructor: record the capacity and create the LinkedHashMap
    public LRUCache(int capacity) {
        this.capacity = capacity;
        // accessOrder = true: LinkedHashMap keeps entries in access order
        this.cache = new LinkedHashMap<>(capacity, 0.75f, true);
    }

    // Get a value from the cache
    public V get(K key) {
        if (!cache.containsKey(key)) {
            return null; // not in the cache
        }
        // A hit moves the entry to the end of the linked list (most recently used)
        return cache.get(key);
    }

    // Add an entry to the cache
    public void put(K key, V value) {
        // Only evict when inserting a new key into a full cache
        // (updating an existing key must not trigger an eviction)
        if (!cache.containsKey(key) && cache.size() >= capacity) {
            // Remove the least recently used entry (the head of the linked list)
            Iterator<Map.Entry<K, V>> iterator = cache.entrySet().iterator();
            if (iterator.hasNext()) {
                iterator.next();
                iterator.remove();
            }
        }
        cache.put(key, value);
    }

    // Print the cache contents
    public void printCache() {
        System.out.println(cache);
    }

    public static void main(String[] args) {
        LRUCache<Integer, String> lruCache = new LRUCache<>(3);

        // Add elements to the cache
        lruCache.put(1, "A");
        lruCache.put(2, "B");
        lruCache.put(3, "C");
        lruCache.printCache(); // Output: {1=A, 2=B, 3=C}

        // Access a cache entry, then insert into the full cache
        lruCache.get(1);       // 1 becomes most recently used
        lruCache.put(4, "D");  // evicts 2, the least recently used
        lruCache.printCache(); // Output: {3=C, 1=A, 4=D}

        lruCache.get(3);       // 3 becomes most recently used
        lruCache.put(5, "E");  // evicts 1, the least recently used
        lruCache.printCache(); // Output: {4=D, 3=C, 5=E}
    }
}
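A more idiomatic variant (not shown in the article, but built entirely from documented `LinkedHashMap` behavior) subclasses `LinkedHashMap` and overrides its `removeEldestEntry` hook, which is invoked after every insertion and evicts the eldest entry when it returns `true`:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCacheV2<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public LRUCacheV2(int capacity) {
        super(capacity, 0.75f, true); // accessOrder = true
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Called after each put/putAll; returning true evicts the eldest entry.
        return size() > capacity;
    }

    public static void main(String[] args) {
        LRUCacheV2<Integer, String> cache = new LRUCacheV2<>(3);
        cache.put(1, "A");
        cache.put(2, "B");
        cache.put(3, "C");
        cache.get(1);        // 1 becomes most recently used
        cache.put(4, "D");   // evicts 2, the least recently used
        System.out.println(cache); // prints {3=C, 1=A, 4=D}
    }
}
```

This removes the manual iterator-based eviction entirely and inherits all `Map` methods for free.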
5. Code explanation
- `LRUCache` class: uses a `LinkedHashMap` to store the cached data. The constructor passes `true` as the third argument (`accessOrder`), so entries are kept in access order.
- `get` method: retrieves a value from the cache. On a hit, the entry is automatically moved to the most recently used position (the tail of the linked list).
- `put` method: adds a value to the cache. If the cache is full, the least recently used entry (the head of the linked list) is removed first.
- Cache capacity control: when the number of cached entries reaches the specified capacity, `put` removes the head entry of the linked list through an iterator, ensuring the cache never exceeds its maximum capacity.
- `printCache` method: prints the current contents of the cache, which helps with debugging and inspecting cache state.
- `main` method: demonstrates the LRU cache with an example: it adds several entries, accesses some of them, and prints the remaining contents after each eviction.
6. How the LRU cache works
- Cache initialization: when an `LRUCache` is created, its maximum capacity is specified.
- Adding entries: the `put` method inserts entries into the cache; if the cache is full, the least recently used entry is evicted first.
- Accessing entries: the `get` method reads entries from the cache; an accessed entry is marked as most recently used.
- LRU eviction: when the capacity limit is reached, the entry at the head of the linked list, i.e. the least recently used one, is deleted.
7. Summary
- LRU cache is a common cache replacement strategy for managing data in a fixed-size cache. By tracking the order in which entries are used, it guarantees that the least recently used entries are evicted first.
- This article showed how to implement an LRU cache in Java with `LinkedHashMap`. Its access-order mode keeps entries ordered by access, so the least recently used entry can always be found and removed when the cache is full.
This implementation is a simplified version suitable for small cache scenarios. If more sophisticated cache control is required (such as concurrency support), it can be further optimized and extended.
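As a first step toward the concurrency support mentioned above, the map can be wrapped with `Collections.synchronizedMap`, which serializes all access. This is a minimal sketch under that assumption (the demo class and factory method names are mine), not a full concurrent cache:

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class SynchronizedLRUDemo {
    // Builds an access-ordered LRU map and wraps it so every call is synchronized.
    // Note: in an access-ordered LinkedHashMap even get() mutates internal links,
    // so reads must be guarded as well as writes; the wrapper does both.
    static <K, V> Map<K, V> newCache(int capacity) {
        return Collections.synchronizedMap(new LinkedHashMap<K, V>(capacity, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > capacity;
            }
        });
    }

    public static void main(String[] args) {
        Map<Integer, String> cache = newCache(3);
        cache.put(1, "A");
        cache.put(2, "B");
        cache.put(3, "C");
        cache.put(4, "D"); // evicts 1, the least recently used
        System.out.println(cache.containsKey(1)); // prints false
    }
}
```

A single lock limits throughput under heavy contention; for high-concurrency workloads a purpose-built concurrent cache library is the usual choice.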
That concludes this detailed walkthrough of implementing an LRU cache in Java. For more on Java LRU caching, please see my other related articles!