1. Overview
(I) Necessity of caching
As system traffic grows, the database often becomes the performance bottleneck. To reduce database load and improve the system's response speed, we can introduce a cache.
(II) Cache Strategy
Common caching strategies include:
- Read/write splitting: route read operations to slave nodes to relieve pressure on the master node.
- Database and table sharding: spread read and write operations across multiple nodes so that no single node is overloaded.
- Caching system: use a cache (such as Redis or Ehcache) to hold frequently accessed data and reduce direct access to the database.
(III) Spring Cache
Spring 3.1 introduced annotation-based caching. Annotations such as @Cacheable hide the caching logic and support multiple cache implementations. Spring Cache's features include:
- A small number of annotations is enough to make existing code cache-aware.
- Works out of the box: no additional third-party components need to be installed or deployed.
- Supports the Spring Expression Language (SpEL), so any attribute or method of an object can be used to define cache keys and conditions.
- Supports AspectJ, which extends caching to any method.
- Supports custom keys and custom cache managers, giving considerable flexibility and extensibility.
2. Annotations
(I) @Cacheable
The @Cacheable annotation is placed on a method and caches the method's result. The execution flow is:
- Check whether the method's result is already cached. If so, return the cached result directly.
- Execute the method to obtain the result.
- Check whether the caching condition is met. If so, store the result in the cache.
- Return the result.
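The four steps above amount to the classic cache-aside pattern. As a rough plain-Java sketch of that flow (an illustration only, not Spring's actual implementation, which lives in its AOP cache interceptor):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;
import java.util.function.Predicate;

public class CacheAside {
    private final Map<Object, Object> cache = new ConcurrentHashMap<>();

    // Mimics @Cacheable: return the cached value if present; otherwise
    // invoke the method, cache the result if the condition allows, return it.
    public Object getOrLoad(Object key, Function<Object, Object> method,
                            Predicate<Object> shouldCache) {
        Object cached = cache.get(key);           // step 1: look up the cache
        if (cached != null) {
            return cached;                        // cache hit: skip the method
        }
        Object result = method.apply(key);        // step 2: execute the method
        if (result != null && shouldCache.test(result)) {
            cache.put(key, result);               // step 3: cache if allowed
        }
        return result;                            // step 4: return the result
    }
}
```

On the second call with the same key the method is never invoked, which is exactly what makes @Cacheable a fit for read operations.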
Common properties:
- cacheNames: the cache name. Required. It is an array, so multiple cache names may be given.
- key: the cache key. Optional. If empty, all method parameters are combined to form the default key. If set, it must be a SpEL expression; for example, @Cacheable(value = "users", key = "#id") uses the value of the id parameter as the cache key.
- condition: a condition on the method arguments that decides whether to cache. Optional. If set, it must be a SpEL expression; for example, @Cacheable(condition = "#id > 0") caches only when the id passed in is greater than zero.
- unless: a condition on the method's return value that vetoes caching. Optional. If set, it must be a SpEL expression; for example, @Cacheable(unless = "#result == null") skips caching when the result is null.
Less commonly used properties:
- keyGenerator: the name of a custom KeyGenerator bean. Optional. If set, key is ignored.
- cacheManager: the name of a custom CacheManager bean. Optional. Usually left empty unless there are multiple CacheManager beans.
- cacheResolver: the name of a custom CacheResolver bean. Optional.
- sync: whether to synchronize method execution on a cache miss. Defaults to false, meaning no synchronization. If set to true, a lock is taken when executing the method, so that only one thread executes it at a time while other threads block and wait.
(II) @CachePut
The @CachePut annotation is placed on a method and caches the method's result. Unlike @Cacheable, its execution flow is:
- Execute the method to obtain the result. That is, the method runs whether or not a cached value exists.
- Check whether the caching condition is met. If so, store the result in the cache.
- Return the result.
Generally speaking, @Cacheable goes with read operations to write the cache passively, while @CachePut goes with write operations to write the cache actively.
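Continuing the plain-Java analogy (a sketch of the semantics, not Spring internals): a @CachePut-style operation always executes the method and then refreshes the cache, whereas a @Cacheable-style lookup skips the method on a hit.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Supplier;

public class CachePutDemo {
    private final Map<Object, Object> cache = new ConcurrentHashMap<>();

    // @Cacheable-style: only runs the loader on a cache miss (passive write).
    public Object cacheable(Object key, Supplier<Object> loader) {
        return cache.computeIfAbsent(key, k -> loader.get());
    }

    // @CachePut-style: always runs the method, then overwrites the cache
    // entry with the fresh result (active write).
    public Object cachePut(Object key, Supplier<Object> method) {
        Object result = method.get();
        cache.put(key, result);
        return result;
    }

    public Object peek(Object key) {
        return cache.get(key);
    }
}
```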
(III) @CacheEvict
The @CacheEvict annotation is placed on a method and deletes cache entries. Compared with @CachePut, it has two additional properties:
- allEntries: whether to delete the caches of all keys under the cache name (cacheNames). Defaults to false, which deletes only the cache of the specified key.
- beforeInvocation: whether to delete the cache before the method executes. Defaults to false, which deletes it after the method executes.
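The effect of allEntries can be sketched in plain Java (an illustration of the semantics, not Spring's code): false removes one key, true clears everything under the cache name.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class CacheEvictDemo {
    private final Map<Object, Object> cache = new ConcurrentHashMap<>();

    public void put(Object key, Object value) {
        cache.put(key, value);
    }

    // Mimics @CacheEvict: allEntries = false removes a single key,
    // allEntries = true clears every entry under the cache name.
    public void evict(Object key, boolean allEntries) {
        if (allEntries) {
            cache.clear();
        } else {
            cache.remove(key);
        }
    }

    public int size() {
        return cache.size();
    }
}
```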
(IV) @Caching
The @Caching annotation is placed on a method and combines multiple @Cacheable, @CachePut, and @CacheEvict annotations. It is rarely used and can be ignored for now.
(V) @CacheConfig
The @CacheConfig annotation is placed on a class and shares the following four property settings among the class's methods:
- cacheNames
- keyGenerator
- cacheManager
- cacheResolver
(VI) @EnableCaching
The @EnableCaching annotation turns on the Spring Cache feature, so be sure to add it.
3. Spring Boot Integration
(I) Dependency
Spring Boot provides the spring-boot-starter-cache library, which auto-configures Spring Cache through the CacheAutoConfiguration configuration class.
(II) Cache tools and frameworks
In Java backend development, common caching tools and frameworks are listed as follows:
- Local cache: Guava LocalCache, Ehcache, Caffeine. Ehcache offers the most features, while Caffeine outperforms Guava LocalCache.
- Distributed Cache: Redis, Memcached, Tair. Redis is the most mainstream and commonly used.
(III) Automatic configuration
Given all these cache schemes, how does spring-boot-starter-cache know which one to use? By default, Spring Boot determines the cache scheme in the following order and creates the corresponding CacheManager:
private static final Map<CacheType, Class<?>> MAPPINGS;

static {
    Map<CacheType, Class<?>> mappings = new EnumMap<>(CacheType.class);
    mappings.put(CacheType.GENERIC, GenericCacheConfiguration.class);
    mappings.put(CacheType.JCACHE, JCacheCacheConfiguration.class);
    mappings.put(CacheType.EHCACHE, EhCacheCacheConfiguration.class);
    mappings.put(CacheType.HAZELCAST, HazelcastCacheConfiguration.class);
    mappings.put(CacheType.INFINISPAN, InfinispanCacheConfiguration.class);
    mappings.put(CacheType.COUCHBASE, CouchbaseCacheConfiguration.class);
    mappings.put(CacheType.REDIS, RedisCacheConfiguration.class);
    mappings.put(CacheType.CAFFEINE, CaffeineCacheConfiguration.class);
    mappings.put(CacheType.SIMPLE, SimpleCacheConfiguration.class);
    mappings.put(CacheType.NONE, NoOpCacheConfiguration.class);
    MAPPINGS = Collections.unmodifiableMap(mappings);
}
As a last resort it falls back to SimpleCacheConfiguration. Because the automatic choice may differ from the cache scheme we actually want, we can specify the type manually through the spring.cache.type configuration item.
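For example, a minimal application.yml fragment that pins the scheme to Redis rather than relying on auto-detection:

```yaml
spring:
  cache:
    type: redis   # force the Redis scheme instead of auto-detection
```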
4. Ehcache Example
(I) Introduce dependencies
In the pom.xml file, introduce the related dependency:

<dependency>
    <groupId>net.sf.ehcache</groupId>
    <artifactId>ehcache</artifactId>
</dependency>
(II) Application configuration file
In the resources directory, create the application.yml configuration file. The configuration is as follows:

spring:
  cache:
    type: ehcache
(III) Ehcache configuration file
In the resources directory, create the ehcache.xml configuration file. The configuration is as follows:

<ehcache>
    <cache name="users"
           maxElementsInMemory="1000"
           timeToLiveSeconds="60"
           memoryStoreEvictionPolicy="LRU"/>
</ehcache>
(IV) Application
Create the Application class. The code is as follows:

@SpringBootApplication
@EnableCaching
public class Application {
}
(V) UserDO
Under the package path, create the UserDO class, the user DO. The code is as follows:

@TableName(value = "users")
public class UserDO {

    private Integer id;
    private String username;
    private String password;
    private Date createTime;

    @TableLogic
    private Integer deleted;

    // getters and setters omitted
}
(VI) UserMapper
Under the package path, create the UserMapper interface. The code is as follows:

@Repository
@CacheConfig(cacheNames = "users")
public interface UserMapper extends BaseMapper<UserDO> {

    @Cacheable(key = "#id")
    UserDO selectById(Integer id);

    @CachePut(key = "#user.id")
    default UserDO insert0(UserDO user) {
        insert(user);
        return user;
    }

    @CacheEvict(key = "#id")
    int deleteById(Integer id);

}
(VII) UserMapperTest
Create the UserMapperTest test class to exercise each UserMapper operation. The core code is as follows:

@RunWith(SpringRunner.class)
@SpringBootTest(classes = Application.class)
public class UserMapperTest {

    private static final String CACHE_NAME_USER = "users";

    @Autowired
    private UserMapper userMapper;
    @Autowired
    private CacheManager cacheManager;

    @Test
    public void testCacheManager() {
        System.out.println(cacheManager);
    }

    @Test
    public void testSelectById() {
        Integer id = 1;
        // first query: misses the cache and hits the database
        UserDO user = userMapper.selectById(id);
        System.out.println("user:" + user);
        Assert.assertNotNull("Cache is empty", cacheManager.getCache(CACHE_NAME_USER).get(id, UserDO.class));
        // second query: served from the cache
        user = userMapper.selectById(id);
        System.out.println("user:" + user);
    }

    @Test
    public void testInsert() {
        UserDO user = new UserDO();
        user.setUsername(UUID.randomUUID().toString());
        user.setPassword("nicai");
        user.setCreateTime(new Date());
        user.setDeleted(0);
        userMapper.insert0(user);
        Assert.assertNotNull("Cache is empty", cacheManager.getCache(CACHE_NAME_USER).get(user.getId(), UserDO.class));
    }

    @Test
    public void testDeleteById() {
        UserDO user = new UserDO();
        user.setUsername(UUID.randomUUID().toString());
        user.setPassword("nicai");
        user.setCreateTime(new Date());
        user.setDeleted(0);
        userMapper.insert0(user);
        Assert.assertNotNull("Cache is empty", cacheManager.getCache(CACHE_NAME_USER).get(user.getId(), UserDO.class));
        userMapper.deleteById(user.getId());
        Assert.assertNull("Cache is not empty", cacheManager.getCache(CACHE_NAME_USER).get(user.getId(), UserDO.class));
    }

}
5. Redis Example
(I) Introduce dependencies
In the pom.xml file, introduce the related dependencies:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-redis</artifactId>
    <exclusions>
        <!-- exclude the default Lettuce client in favor of Jedis -->
        <exclusion>
            <groupId>io.lettuce</groupId>
            <artifactId>lettuce-core</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>redis.clients</groupId>
    <artifactId>jedis</artifactId>
</dependency>
(II) Application configuration file
In the resources directory, create the application.yml configuration file. The configuration is as follows:

spring:
  redis:
    host: 127.0.0.1
    port: 6379
    password:
    database: 0
    timeout: 0
    jedis:
      pool:
        max-active: 8
        max-idle: 8
        min-idle: 0
        max-wait: -1
  cache:
    type: redis
(III) Application
Same as 4.4 Application.
(IV) UserDO
Same as 4.5 UserDO, except that UserDO must also implement the Serializable interface.
(V) UserMapper
Same as 4.6 UserMapper.
(VI) UserMapperTest
Basically the same as 4.7 UserMapperTest.
6. Interview Questions and Answers
(I) Necessity of caching
Question: Why use a cache?
Answer: As system traffic grows, the database often becomes the performance bottleneck. A cache reduces database load and improves the system's response speed. Caching systems (such as Redis and Ehcache) store frequently accessed data and reduce direct access to the database.
(II) Spring Cache
Question: What is Spring Cache?
Answer: Spring Cache is the annotation-based caching technology introduced in Spring 3.1. Annotations such as @Cacheable hide the caching logic, and multiple cache implementations are supported. Its features include: a small number of annotations makes existing code cache-aware; it works out of the box; it supports the Spring Expression Language (SpEL); it supports AspectJ; and it supports custom keys and custom cache managers.
(III) @Cacheable
Question: What does the @Cacheable annotation do, and what are its properties?
Answer: @Cacheable is placed on a method and caches the method's result. Common properties include cacheNames (the cache name, required), key (the cache key, optional), condition (a condition on the method arguments that decides whether to cache), and unless (a condition on the return value that vetoes caching).
(IV) @CachePut
Question: What does the @CachePut annotation do?
Answer: @CachePut is placed on a method and caches the method's result. Unlike @Cacheable, its flow is: execute the method to obtain the result; if the caching condition is met, store the result in the cache; return the result.
(V) @CacheEvict
Question: What does the @CacheEvict annotation do?
Answer: @CacheEvict is placed on a method and deletes cache entries. Compared with @CachePut, it has two additional properties: allEntries (whether to delete the caches of all keys under the cache name) and beforeInvocation (whether to delete the cache before the method executes).
(VI) Spring Boot Cache Integration
Question: How do you integrate caching in Spring Boot?
Answer: Introduce the spring-boot-starter-cache dependency. Spring Boot automatically determines which cache scheme to use, and the scheme can also be specified manually through spring.cache.type.
(VII) Ehcache and Redis
Question: What is the difference between Ehcache and Redis?
Answer: Ehcache is a pure-Java, in-process caching framework, fast and lean, and the default CacheProvider in Hibernate. Redis is a key-value in-memory database supporting multiple data types, such as strings, hashes, lists, sets, and sorted sets. Ehcache suits local caching; Redis suits distributed caching.
(VIII) Cache Strategy
Question: What are the common caching strategies?
Answer: Common strategies include read/write splitting (route reads to slave nodes to relieve the master), database and table sharding (spread reads and writes across multiple nodes so no single node is overloaded), and caching systems (use a cache such as Redis or Ehcache to hold frequently accessed data and reduce direct database access).
(IX) Cache breakdown
Question: What is cache breakdown? How do you solve it?
Answer: Cache breakdown happens when a hot key expires and the many threads accessing it concurrently all miss the cache and query the database at the same time, putting pressure on the database. Solutions include using a mutex lock (such as Redis's SETNX command) so that only one thread queries the database at a time, and cache preheating, which loads the cache when the system starts.
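The mutex idea can be sketched in a single process (a simplified analogue of the Redis SETNX mutex, not a distributed implementation): ConcurrentHashMap.computeIfAbsent locks per key, so concurrent misses on the same hot key trigger exactly one database load.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Function;

public class SingleFlightCache {
    private final Map<String, Object> cache = new ConcurrentHashMap<>();
    private final AtomicInteger dbQueries = new AtomicInteger();

    // On a miss, computeIfAbsent holds a per-key lock, so concurrent
    // callers for the same hot key cause exactly one database load.
    public Object get(String key, Function<String, Object> loadFromDb) {
        return cache.computeIfAbsent(key, k -> {
            dbQueries.incrementAndGet();   // counts trips to the database
            return loadFromDb.apply(k);
        });
    }

    public int dbQueryCount() {
        return dbQueries.get();
    }
}
```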
(X) Cache penetration
Question: What is cache penetration? How do you solve it?
Answer: Cache penetration is querying data that does not exist: neither the cache nor the database has it, so every request falls through to the database. Solutions include using a Bloom filter to quickly determine whether the data exists, and caching the non-existent result as well, with a short expiration time.
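The second solution, caching the "not found" result, can be sketched with a sentinel value (an in-memory illustration; a real cache would give the sentinel a short TTL):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

public class NullCachingDemo {
    // Sentinel stored when the database has no row for the key, so repeated
    // lookups for missing data stop hammering the database.
    private static final Object NOT_FOUND = new Object();

    private final Map<Object, Object> cache = new ConcurrentHashMap<>();
    private int dbQueries = 0;

    public Object get(Object key, Function<Object, Object> loadFromDb) {
        Object cached = cache.get(key);
        if (cached != null) {
            return cached == NOT_FOUND ? null : cached;
        }
        dbQueries++;
        Object value = loadFromDb.apply(key);
        // In a real cache the NOT_FOUND entry would carry a short
        // expiration time; this sketch omits TTL handling.
        cache.put(key, value != null ? value : NOT_FOUND);
        return value;
    }

    public int dbQueryCount() {
        return dbQueries;
    }
}
```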
(XI) Cache Avalanche
Question: What is a cache avalanche? How do you solve it?
Answer: A cache avalanche occurs when a large number of cache entries expire at the same time, so a flood of requests queries the database directly and overloads it. Solutions include cache preheating (load the cache when the system starts) and setting different expiration times so that entries do not all expire at once.
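The "different expiration times" fix is usually a random jitter added to a base TTL; a minimal sketch (the 600s/120s numbers are arbitrary examples):

```java
import java.util.Random;

public class TtlJitter {
    // Spreads expirations by adding a random offset to a base TTL, so a
    // batch of keys written together does not all expire at the same moment.
    public static long ttlWithJitter(long baseSeconds, long maxJitterSeconds, Random random) {
        return baseSeconds + (long) (random.nextDouble() * maxJitterSeconds);
    }
}
```

For example, a 10-minute base TTL with up to 2 minutes of jitter yields expirations spread across [600, 720) seconds.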
(XII) Cache expiration strategies
Question: What expiration strategies do caches use?
Answer: Expiration strategies include setting a fixed expiration time (such as with Redis's EXPIRE command), adjusting the expiration time dynamically by access frequency, and adjusting it by the importance of the data.
(XIII) Cache eviction strategies
Question: What are the cache eviction strategies?
Answer: Eviction strategies include first-in-first-out (FIFO), least recently used (LRU), least frequently used (LFU), and random eviction.
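LRU, the most common of these, can be implemented in a few lines with LinkedHashMap in access order (a minimal, non-thread-safe sketch):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// A minimal LRU cache: LinkedHashMap in access order evicts the entry
// that was used least recently once capacity is exceeded.
public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public LruCache(int capacity) {
        super(16, 0.75f, true);   // true = order entries by access
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity;
    }
}
```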
(XIV) Cache serialization
Question: What are the ways to serialize cached values?
Answer: Common serialization methods include Java serialization, JSON serialization, and Protobuf serialization. Java serialization is simple to use but performs poorly; JSON serialization is human-readable and cross-language but produces larger payloads; Protobuf serialization performs best but requires defining .proto files.
(XV) Distributed locks for caching
Question: How do you implement a distributed lock for the cache?
Answer: In a distributed environment, Redis's SETNX command can implement a distributed lock. SETNX guarantees that only one client acquires the lock at a time, which helps prevent cache breakdown and similar problems.
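The SETNX semantics can be illustrated in-process with putIfAbsent (an analogue only: a real distributed lock also needs Redis, a TTL on the lock key, and an atomic check-then-delete on release):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

public class SetNxLock {
    // In-process analogue of Redis SETNX: putIfAbsent succeeds for exactly
    // one caller per key; all others must retry or give up.
    private final ConcurrentMap<String, String> locks = new ConcurrentHashMap<>();

    public boolean tryLock(String key, String owner) {
        return locks.putIfAbsent(key, owner) == null;
    }

    // Only the owner may release, mirroring the check-then-DEL pattern
    // used with real Redis locks to avoid releasing someone else's lock.
    public boolean unlock(String key, String owner) {
        return locks.remove(key, owner);
    }
}
```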
(XVI) Cache transactions
Question: Does the cache support transactions?
Answer: Caches generally do not support transactions themselves, but they can be coordinated with database transactions: for example, update the cache after the database transaction commits, and invalidate the affected cache entries if the transaction rolls back.
(XVII) Cache monitoring
Question: How do you monitor cache usage?
Answer: Redis's INFO command reports cache usage, including memory consumption, hit rate, and the number of expired keys. Third-party monitoring tools such as Prometheus and Grafana are also available.
(XVIII) Cache optimization
Question: How do you optimize cache performance?
Answer: Choose an appropriate caching strategy, use an efficient serialization method, set reasonable expiration times and eviction strategies, use distributed locks to avoid cache breakdown, and monitor cache usage and adjust in time.
(XIX) Cache hit rate
Question: How do you calculate the cache hit rate?
Answer: Hit rate = hits / (hits + misses). The hit and miss counts can be obtained from monitoring tools.
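As a tiny worked example of the formula (the 80/20 figures are illustrative):

```java
public class HitRate {
    // hit rate = hits / (hits + misses); returns 0 when there is no traffic.
    public static double hitRate(long hits, long misses) {
        long total = hits + misses;
        return total == 0 ? 0.0 : (double) hits / total;
    }
}
```

With 80 hits and 20 misses the hit rate is 80 / 100 = 0.8.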
(XX) Cache scenarios
Question: What scenarios is caching suitable for?
Answer: Caching suits frequently accessed data, such as user and product information; data that changes infrequently, such as configuration; and data with loose real-time requirements, such as statistics.
(XXI) Limitations of caching
Question: What are the limitations of caching?
Answer: Cached data can become inconsistent with the database; cache breakdown, penetration, and avalanche can occur; cache memory is limited; serialization and deserialization cost performance; and distributed locks are complex to implement.
(XXII) Future trends in caching
Question: What are the future trends of caching technology?
Answer: Deeper integration with databases, with support for transactions and consistency; more data types and query methods; better performance and scalability; and more convenient monitoring and management tools.
The above is a detailed introduction to Spring Boot Cache, together with interview questions and answers. I hope it helps beginners. Happy studying!